Academic Article

OpenCap: Human movement dynamics from smartphone videos.
Document Type
Article
Source
PLoS Computational Biology. 10/19/2023, Vol. 19 Issue 10, p1-26. 26p. 1 Color Photograph, 1 Chart, 5 Graphs.
Subject
*HUMAN mechanics
*DEEP learning
*JOINTS (Anatomy)
*COMPUTER vision
*MEDICAL screening
*COMPUTING platforms
Language
English
ISSN
1553-734X
Abstract
Measures of human movement dynamics can predict outcomes like injury risk or musculoskeletal disease progression. However, these measures are rarely quantified in large-scale research studies or clinical practice due to the prohibitive cost, time, and expertise required. Here we present and validate OpenCap, an open-source platform for computing both the kinematics (i.e., motion) and dynamics (i.e., forces) of human movement using videos captured from two or more smartphones. OpenCap leverages pose estimation algorithms to identify body landmarks from videos; deep learning and biomechanical models to estimate three-dimensional kinematics; and physics-based simulations to estimate muscle activations and musculoskeletal dynamics. OpenCap's web application enables users to collect synchronous videos and visualize movement data that are automatically processed in the cloud, thereby eliminating the need for specialized hardware, software, and expertise. We show that OpenCap accurately predicts dynamic measures, like muscle activations, joint loads, and joint moments, which can be used to screen for disease risk, evaluate intervention efficacy, assess between-group movement differences, and inform rehabilitation decisions. Additionally, we demonstrate OpenCap's practical utility through a 100-subject field study, where a clinician using OpenCap estimated musculoskeletal dynamics 25 times faster than a laboratory-based approach at less than 1% of the cost. By democratizing access to human movement analysis, OpenCap can accelerate the incorporation of biomechanical metrics into large-scale research studies, clinical trials, and clinical practice.

Author summary: Analyzing how humans move, how we coordinate our muscles, and what forces act on the musculoskeletal system is important for studying neuro-musculoskeletal conditions. Traditionally, measuring these quantities requires expensive laboratory equipment, a trained expert, and hours of analysis. Thus, high-quality measures of human movement are rarely incorporated into clinical practice and large-scale research studies. The advent of computer vision methods for locating human joints from standard videos offers a promising alternative to laboratory-based movement analysis. However, it is unclear whether these methods provide sufficient information for informing biomedical research and clinical practice. Here, we introduce OpenCap, an open-source, web-based software tool for computing the motion (e.g., joint angles) and the musculoskeletal forces underlying human movement (e.g., joint forces) from smartphone videos. OpenCap combines advances in computer vision, machine learning, and musculoskeletal simulation to make movement analysis widely available without specialized hardware, software, or expertise. We validate OpenCap against laboratory-based measurements and show its usefulness for applications including screening for disease risk, evaluating intervention efficacy, and informing rehabilitation decisions. Finally, we highlight how OpenCap enables large-scale studies of human movement in real-world settings. [ABSTRACT FROM AUTHOR]
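The abstract's kinematics-to-dynamics step (turning measured motion into joint moments) can be illustrated with a toy inverse-dynamics calculation. This is a minimal sketch of the general principle only, not OpenCap's physics-based simulation pipeline; the segment parameters and function name below are hypothetical.

```python
import math

def joint_moment(theta, alpha, mass, com_dist, inertia, g=9.81):
    """Net joint moment (N*m) for a single planar segment pivoting at a
    fixed joint, with theta measured from the downward vertical:
        tau = I * alpha + m * g * r * sin(theta)
    A toy inverse-dynamics step, not OpenCap's musculoskeletal model."""
    return inertia * alpha + mass * g * com_dist * math.sin(theta)

# Example: a shank-like segment (hypothetical parameters: 3.5 kg,
# center of mass 0.2 m from the joint, inertia 0.06 kg*m^2) held
# static at 30 degrees, so gravity alone determines the moment.
tau = joint_moment(theta=math.radians(30), alpha=0.0,
                   mass=3.5, com_dist=0.2, inertia=0.06)
```

In a full pipeline like the one the abstract describes, theta and alpha would come from video-derived kinematics rather than being supplied by hand, and the simulation would resolve the net moment into individual muscle contributions.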