R. James Cotton
@peabody124.bsky.social
Physiatrist, neuroscientist, gait and movement, brain and robotics enthusiast. Assistant Professor at Northwestern and Shirley Ryan AbilityLab
At #SfN and want to hear about how BiomechGPT can provide a language interface to your biomechanics data? Check out the poster, happening now at YY4, with Ruize Yang and @antihebbiann.bsky.social
November 17, 2025 at 10:21 PM
If you are at @acrmrehab.bsky.social, I will be presenting this morning (8am)

SPECIAL SYMPOSIUM: AI-Powered Movement Analysis and a Causal Framework for Precision Rehabilitation SS1 #ACRM2025 cdmcd.co/5npwMR
October 28, 2025 at 11:22 AM
More demos and code available at intelligentsensingandrehabilitation.github.io/MonocularBio...

JD did a great job creating a Gradio demo so try it out and let us know what you think

And here is a video of JD going on a celebratory run that the preprint is out :)
July 14, 2025 at 6:08 AM
Excitingly, in addition to producing accurate kinematics, we can measure the gait deviation index from these videos. We find it is quite sensitive across a number of different clinical backgrounds, and even more responsive after neurosurgical interventions than the standard clinical outcomes (mJOA).
July 14, 2025 at 6:08 AM
Central to this was extending our end-to-end differentiable biomechanics approach to fit both 2D and 3D keypoints measured from images. This can also account for smartphone rotation measured by our Portable Biomechanics Platform, which makes this easy to integrate into clinical workflows.
July 14, 2025 at 6:08 AM
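At a high level, this kind of keypoint fitting can be sketched as gradient-based inverse kinematics: pose a kinematic model, compare it to the detected keypoints, and descend on the error. Below is a toy 2-link planar version in NumPy, with finite-difference gradients standing in for real autodiff; the chain, function names, and parameters are all illustrative assumptions, not our actual pipeline.

```python
import numpy as np

def forward_kinematics(angles, lengths=(1.0, 1.0)):
    """Toy 2-link planar chain: returns shoulder, elbow, wrist 2D points."""
    a1, a2 = angles
    l1, l2 = lengths
    shoulder = np.zeros(2)
    elbow = shoulder + l1 * np.array([np.cos(a1), np.sin(a1)])
    wrist = elbow + l2 * np.array([np.cos(a1 + a2), np.sin(a1 + a2)])
    return np.stack([shoulder, elbow, wrist])

def keypoint_loss(angles, observed):
    """Squared error between model keypoints and observed keypoints."""
    return np.sum((forward_kinematics(angles) - observed) ** 2)

def fit(observed, init=np.zeros(2), lr=0.05, steps=500, eps=1e-5):
    """Gradient descent on joint angles (finite differences for brevity)."""
    angles = init.astype(float).copy()
    for _ in range(steps):
        grad = np.zeros_like(angles)
        for i in range(len(angles)):
            d = np.zeros_like(angles); d[i] = eps
            grad[i] = (keypoint_loss(angles + d, observed)
                       - keypoint_loss(angles - d, observed)) / (2 * eps)
        angles -= lr * grad
    return angles

# Synthesize "observed" keypoints from known angles, then recover them.
true_angles = np.array([0.4, -0.7])
observed = forward_kinematics(true_angles)
est = fit(observed)
```

In the real system the forward model is a full biomechanical skeleton and the gradients come from automatic differentiation, but the optimize-to-fit-keypoints loop is the same shape.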
We developed a novel approach to fitting biomechanics from smartphone video that produces kinematic reconstructions within a few degrees and has been validated across a wide range of activities and clinical backgrounds.
July 14, 2025 at 6:08 AM
The next paper is BiomechGPT arxiv.org/abs/2505.18465 with @antihebbiann.bsky.social and Ruize Yang, which trains a language model to be fluent in tokenized movement sequences. This draws inspiration from MotionGPT but focuses on benchmarking performance on clinically meaningful tasks.
May 27, 2025 at 9:36 PM
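The tokenization step can be pictured with a minimal vector-quantization sketch: each continuous pose frame is mapped to the index of its nearest codebook entry, giving a discrete vocabulary a language model can consume (the style of discretization MotionGPT popularized). The codebook, feature dimension, and function names below are hypothetical, not BiomechGPT's actual implementation.

```python
import numpy as np

def tokenize_motion(frames, codebook):
    """Map each pose frame to the index of its nearest codebook entry
    (VQ-style discretization of a continuous movement sequence)."""
    # frames: (T, D) pose features; codebook: (K, D) learned centroids
    dists = np.linalg.norm(frames[:, None, :] - codebook[None, :, :], axis=-1)
    return dists.argmin(axis=1)

def detokenize(tokens, codebook):
    """Approximate reconstruction: look the centroids back up."""
    return codebook[tokens]

rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))               # toy 8-entry codebook
frames = codebook[[1, 1, 5, 2]] + 0.01 * rng.normal(size=(4, 4))
tokens = tokenize_motion(frames, codebook)       # recovers indices [1, 1, 5, 2]
```

A real pipeline learns the codebook (e.g., with a VQ-VAE) rather than sampling it, but once movement is a token stream, language-model training proceeds as usual.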
Here is another example. It also captures some imperfections, like little foot slips, which we want to improve.
May 27, 2025 at 9:36 PM
Since then, we've tuned it up to handle anthropometric and muscle scaling. Still lots of work to do in tuning this further, as there are many things we aren't yet scaling, such as mass and inertia, and optimizing w.r.t. the EMG data we have from our wearable sensors.
May 27, 2025 at 9:36 PM
@jdpeiffer.bsky.social and Tim Unger also won second prize for the ICORR talks for "Differentiable Biomechanics for Markerless Motion Capture in Upper Limb Stroke Rehabilitation: A Comparison with Optical Motion Capture" arxiv.org/abs/2411.14992.
May 27, 2025 at 1:15 PM
With Kyle Embry and the @abilitylab.bsky.social C-STAR team we also organized a half-day workshop "From Motion to Meaning: AI-Enabled Biomechanics for Rehabilitation" showcasing work from Georgios Pavlakos, Eni Halilaj, Vikash Kumar, Chris Awai, and Pouyan Firouzabadi.
May 27, 2025 at 1:15 PM
With Dailyn Despradel, Derek Kamper, @marcslutzky.bsky.social, and @dougweberlab.bsky.social we organized a workshop on EMG biofeedback. Very much enjoyed the engaged discussion on how to disseminate these technologies into the real world.
May 27, 2025 at 1:15 PM
Looking forward to presenting on what we can do with large-scale biomechanics data in rehabilitation in the SoMNiR #RehabWeek 2025 session this afternoon! @abilitylab.bsky.social
May 15, 2025 at 5:39 PM
@jdpeiffer.bsky.social and Tim Unger gave a great talk in the ICORR best student paper session on their work using markerless motion capture to track arm kinematics of people with stroke. arxiv.org/abs/2411.14992
May 14, 2025 at 11:34 PM
@ben-petrie.bsky.social gave a great #physiatry2025 presentation on EMG biofeedback for people with spinal cord injury and won first prize! Thanks to the Craig H. Neilsen Foundation for support for this project, and to all of our participants.

Excited to share this soon in an upcoming manuscript!
March 1, 2025 at 11:50 PM
We already do, although currently just for the bilevel scaling and marker offsets that we optimize jointly with the inverse kinematics (arxiv.org/abs/2402.17192)

Starting to think about scaling for muscles, which we need for some of our new stuff to get rid of the scaling inconsistency.
February 12, 2025 at 7:23 PM
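The bilevel idea above can be sketched minimally: a shared per-subject parameter (here a single segment scale) is optimized jointly with per-frame joint angles against the keypoint error. Everything below (the 2-link chain, finite-difference gradients, function names) is a toy illustration under those assumptions, not the paper's implementation.

```python
import numpy as np

def chain(angles, scale):
    """2D joint positions of a 2-link planar chain with a shared scale."""
    a1, a2 = angles
    p1 = scale * np.array([np.cos(a1), np.sin(a1)])
    p2 = p1 + scale * np.array([np.cos(a1 + a2), np.sin(a1 + a2)])
    return np.stack([p1, p2])

def total_loss(params, observed):
    """params = [scale, a1_frame1, a2_frame1, a1_frame2, ...]."""
    scale, angles = params[0], params[1:].reshape(-1, 2)
    return sum(np.sum((chain(a, scale) - obs) ** 2)
               for a, obs in zip(angles, observed))

def fit(observed, n_frames, lr=0.02, steps=3000, eps=1e-5):
    """Joint gradient descent over the shared scale and all frame angles."""
    params = np.concatenate([[1.0], np.zeros(2 * n_frames)])
    for _ in range(steps):
        g = np.zeros_like(params)
        for i in range(len(params)):   # finite-difference gradient
            d = np.zeros_like(params); d[i] = eps
            g[i] = (total_loss(params + d, observed)
                    - total_loss(params - d, observed)) / (2 * eps)
        params -= lr * g
    return params[0], params[1:].reshape(-1, 2)

# Synthesize two frames with a known scale, then recover both levels.
true_scale = 1.3
true_angles = np.array([[0.3, -0.5], [0.5, 0.2]])
observed = [chain(a, true_scale) for a in true_angles]
scale_est, angles_est = fit(observed, n_frames=2)
```

The key point is that the subject-level variables (scaling, marker offsets) see gradients accumulated across every frame, while the frame-level variables only see their own frame's error.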
Another fun one, now in real time (instead of the 60 fps -> 30 fps in the prior one)
February 3, 2025 at 6:04 PM
Super stoked about the progress @jdpeiffer.bsky.social is making on monocular biomechanical analysis. Also really fun to connect back to my old drone hobby (and they are so much easier now than a decade ago buildandcrash.blogspot.com/2014/10/refl...)
February 3, 2025 at 6:03 PM
And Jose Pons gave a great talk on electrical stimulation in rehabilitation for stroke, SCI and PD.
November 15, 2024 at 7:17 PM
Also, I love this speaker gift! Definitely a solid likeness. And amazing to have the meeting by a 500+ year old castle!
November 15, 2024 at 7:16 PM
I enjoyed presenting at the ICOT meeting in Gran Canaria www.jornadasicotdca.com. Very fun learning and hearing more about international rehabilitation and seeing the vibrant community they have here!
November 15, 2024 at 7:14 PM
I enjoyed speaking about this framework in my #ICNR2024 plenary and appreciated the amount of positive feedback about it. Looking forward to future interactions!
November 12, 2024 at 9:07 AM
While rehabilitation is moving towards an era of big data, we still lack a framework to analyze this data to improve outcomes. We took a stab at this, which we call a "Causal Framework for Precision Rehabilitation." arxiv.org/abs/2411.03919
November 12, 2024 at 9:05 AM