Researchers at CAMERA, the University of Bath's Centre for Analysis of Motion, Entertainment and Research Applications, have developed open-access software that analyzes human motion from video without the use of markers. They have shown that the markerless system offers clinicians, sports coaches and physiotherapists an unobtrusive way of analyzing body movements from video footage, with results comparable to those of marker-based methods.
Motion analysis traditionally relies on attaching light-reflective markers onto specific points on the body; the movement of these markers in 3D space is then calculated using data from an array of cameras that film the person’s movements from different angles.
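That calculation is, at its core, a multi-view triangulation problem: each calibrated camera constrains where a marker can lie in 3D space. As a rough illustration of the principle only (this is not the CAMERA software, and the function and variable names are ours), a marker's position can be recovered from two or more calibrated views by a direct linear transform:

```python
# Illustrative sketch: triangulating one marker's 3D position from several
# calibrated cameras via the direct linear transform (DLT).
import numpy as np

def triangulate_marker(proj_mats, points_2d):
    """Least-squares 3D position of a marker seen by multiple cameras.

    proj_mats : sequence of 3x4 camera projection matrices (assumed calibrated)
    points_2d : sequence of (u, v) pixel observations, one per camera
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the homogeneous
        # point X:  u * (P[2] @ X) = P[0] @ X  and  v * (P[2] @ X) = P[1] @ X
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.vstack(rows)
    # The solution is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize to (x, y, z)
```

With two or more views, the least-squares solution localizes the marker; production systems layer lens-distortion correction, marker identification and temporal filtering on top of this.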
Placing markers accurately on the body is time-consuming, and the markers themselves can sometimes interfere with the person's natural movements.
To overcome this, the team at CAMERA, led by Dr. Steffi Colyer, has developed a non-invasive markerless system that uses computer vision and deep learning to measure motion by identifying body landmarks in regular 2D images.
Using the same images to evaluate the performance of their fully automated system, they found the results were comparable to those of a traditional marker-based motion capture system. The system relies on technology similar to that used by commercial systems, but it is released as an open-source workflow that can be more easily adapted to users' needs.
The team has also released a unique dataset to allow other researchers to evaluate new markerless algorithms and help advance the fields of computer vision and biomechanics.
The team used the open-source computer vision system OpenPose to estimate the positions of the joints in 2D video images of a person running, jumping and walking. They then fused those estimates across cameras into 3D and fed the data into the open-source modeling software OpenSim, which fits a skeleton to the joints so that the whole-body motion can be obtained.
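To make that flow concrete, here is a minimal sketch of how such a pipeline might be wired together in Python. It is not the authors' released code: it assumes OpenPose's standard per-frame JSON output, per-camera 3x4 projection matrices, the triangulate_marker helper from the earlier snippet, and a prepared OpenSim inverse-kinematics setup file (ik_setup.xml), all of which are illustrative assumptions.

```python
# Minimal sketch of the described pipeline (not the authors' released code).
import json
import numpy as np
import opensim  # OpenSim 4.x Python bindings

def load_openpose_frame(json_path):
    """Return an (n_keypoints, 3) array of (x, y, confidence) for one frame."""
    with open(json_path) as f:
        frame = json.load(f)
    kp = frame["people"][0]["pose_keypoints_2d"]  # first detected person
    return np.asarray(kp, dtype=float).reshape(-1, 3)

def fuse_to_3d(camera_frames, proj_mats, min_conf=0.3):
    """Triangulate each keypoint seen with sufficient confidence in >= 2 views."""
    n_kp = camera_frames[0].shape[0]
    points_3d = np.full((n_kp, 3), np.nan)
    for k in range(n_kp):
        views = [(P, f[k, :2]) for P, f in zip(proj_mats, camera_frames)
                 if f[k, 2] >= min_conf]
        if len(views) >= 2:
            Ps, uvs = zip(*views)
            points_3d[k] = triangulate_marker(Ps, uvs)  # from earlier sketch
    return points_3d

# After writing the per-frame 3D points to an OpenSim-readable .trc marker
# file, the skeleton is fitted with OpenSim's inverse kinematics tool:
ik_tool = opensim.InverseKinematicsTool("ik_setup.xml")  # hypothetical setup file
ik_tool.run()  # produces joint angles describing the whole-body motion
```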
The fully synchronized video and marker-based data used in this study, along with the code underpinning the markerless pipeline, are now available and fully described in a paper recently published in Scientific Data.
Dr. Colyer said, "The trouble with using markers is that they can be tricky to place on a participant accurately and reliably, and this process can take a long time, which isn't very practical for many participants and applications (for example, elite athletes or clinical populations).
"Our markerless system estimates the joint positions from video alone, without the need for any equipment to be placed on the participant or any preparation time. This opens the door for us to capture motion data more readily in settings outside of the laboratory, and the outcomes for the movements we analyzed are comparable to traditionally used marker-based techniques.
“Our pipeline is open source, which means that anyone with some expertise in the area can use it for free to get movement data from normal video footage.
"This could be useful for physiotherapists, clinicians and sports trainers in a wide range of applications, including sports performance and injury prevention or rehabilitation. Additionally, the accompanying dataset provides the first high-quality benchmark for evaluating emerging algorithms in this rapidly evolving field.
“We have used the system to measure the biomechanics of skeleton athletes during their push-starts and we have recently taken it out on to the tennis and badminton courts to unobtrusively monitor how much work the players are performing during training and match play.”
More information:
Murray Evans et al, Synchronised Video, Motion Capture and Force Plate Dataset for Validating Markerless Human Movement Analysis, Scientific Data (2024). DOI: 10.1038/s41597-024-04077-3
Provided by University of Bath