We develop software for analyzing human movement from video. This area integrates computer vision, biomechanical simulation, deep learning, and software development. Our main tool, OpenCap, estimates both kinematics and kinetics (e.g., ground and muscle forces) from smartphone videos and allows us to collect large-scale datasets in a matter of days, a process that used to take years in a lab. OpenCap is used by thousands of research labs around the world.
Ongoing projects include both technical development and applying OpenCap to improve clinical research and care. For example, we are developing movement biomarkers for the study and treatment of movement-related conditions, such as myotonic dystrophy.
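As an illustration of the kind of downstream analysis these data support, here is a minimal Python sketch, not the OpenCap API itself, that loads an OpenSim-style .mot joint-angle file like those the OpenCap pipeline exports and summarizes one coordinate. The trial file name and the knee_angle_r column label are assumptions made for the example.

# Minimal sketch (not the OpenCap API): load an OpenSim-style .mot file,
# the joint-angle format exported by the OpenCap pipeline, and summarize
# one coordinate. The file name and "knee_angle_r" column are assumptions.
from io import StringIO
import pandas as pd

def load_mot(path: str) -> pd.DataFrame:
    """Read a .mot file: a plain-text header terminated by an 'endheader'
    line, followed by tab-separated columns (time plus one per coordinate)."""
    with open(path) as f:
        lines = f.readlines()
    start = next(i for i, line in enumerate(lines)
                 if line.strip().lower() == "endheader") + 1
    return pd.read_csv(StringIO("".join(lines[start:])), sep="\t")

if __name__ == "__main__":
    kinematics = load_mot("walking_trial.mot")      # hypothetical trial file
    peak = kinematics["knee_angle_r"].max()         # assumed coordinate label
    print(f"Peak right knee flexion angle: {peak:.1f} deg")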
Representative Publications
Uhlrich, S.D.*, Falisse, A.*, Kidziński, Ł.*, Muccini, J., Ko, M., Chaudhari, A.S., Hicks, J.L., Delp, S.L., 2023. OpenCap: Human movement dynamics from smartphone videos. PLoS Computational Biology 19(10). *Contributed equally.
Boswell, M.A.*, Uhlrich, S.D.*, Kidziński, Ł., Thomas, K., Kolesar, J.A., Gold, G.E., Beaupre, G.S., Delp, S.L., 2021. A neural network to predict the knee adduction moment in patients with osteoarthritis using anatomical landmarks obtainable from 2D video analysis. Osteoarthritis and Cartilage 29(3):346-356. *Contributed equally.
Boswell, M.A., Kidziński, Ł., Hicks, J.L., Uhlrich, S.D., Falisse, A., Delp, S.L., 2023. Smartphone videos of the sit-to-stand test predict osteoarthritis and health outcomes in a nationwide study. npj Digital Medicine 6(32):1-7.