Elizabeth Wood, PhD
@lizbwood.bsky.social
Founder & CEO @jura.bsky.social | Full-stack probabilistic machine learning for the development of genetic medicines | NYC & Basel & Boston
This is an interesting case because, by all rights, Sasha Rudensky could easily have been named and won as well; I'd be curious what Brunkow would say if offered, theoretically, the chance to trade the award for access to a career like Rudensky's, and vice versa!
October 8, 2025 at 1:20 PM
All scientists! But yes, women scientists, too. Ramsdell, who was outside of academia like Brunkow, got to serve as the CSO of Sonoma; Brunkow was a science writer for a while and then a program manager.
October 8, 2025 at 1:15 PM
Could it be that it’s so close to translation? There’s an overwhelming feeling of responsibility to stay in, working on what’s important (curing disease, etc.).
October 6, 2025 at 5:18 PM
I’ll delete this slightly snide comment!
October 6, 2025 at 5:02 PM
I’d find it more heartening if they hadn’t *wanted* to stay in academia: Karikó famously so, but Brunkow too. Lee managed it, but as a senior scientist in her husband and coauthor’s lab.
October 6, 2025 at 4:46 PM
I know that these sorts of stories, where recognition comes in the end, are supposed to provide a certain kind of told-you-so satisfaction and inspiration; but I'm sure I (and most) would trade it for a 20-40-year productive working life spent controlling and shaping a research program.
October 6, 2025 at 10:34 AM
I was thinking this too! If anyone on here has a warm intro for me... I reached out today to a few people there, but just on LinkedIn.
September 25, 2025 at 6:35 PM
You can read more in our post at www.jurabio.com/blog/leavs; preprint forthcoming.
@jura.bsky.social @eliweinstein.bsky.social @mgollub.bsky.social @highvariance.bsky.social
[Link: LeaVS: Accelerating learning for biological AI — JURA Bio, Inc. (www.jurabio.com)]
September 22, 2025 at 12:15 PM
Today we introduce LeaVS (Learning from Variational Synthesis), a system for accelerating AI training on this functional data. LeaVS goes beyond conventional model training procedures by exploiting knowledge of the underlying synthetic library.
The result? A dramatic speedup in learning.
September 22, 2025 at 12:15 PM
Instead, it requires changes to how we design algorithms, allocate resources, and engineer systems. It requires carefully identifying and removing bottlenecks on learning, across the entire training stack.
September 22, 2025 at 12:15 PM
This is what's so transformative about variational-synthesis-driven discovery: we don't have to learn from positive results alone, because we have the underlying generative model of the library's sequences in addition to the positive hits.
July 22, 2025 at 2:32 PM
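To make the idea in the post above concrete: when the distribution that generated the library is known, sequences sampled from it can serve as background examples against the observed positive hits. The following is a hedged toy sketch of that general principle in a noise-contrastive style; it is purely illustrative and is not JURA's actual LeaVS or VISTA method, and every name and parameter in it is invented for the example.

```python
# Illustrative sketch only: one generic way to exploit a known synthesis
# library when only positive hits are observed. Sequences drawn from the
# library (proposal) distribution act as background examples, letting us
# fit a discriminative model without explicit negatives.
import numpy as np

rng = np.random.default_rng(0)
L_SEQ, A = 8, 4  # toy sequence length and alphabet size

# Known library distribution: an independent categorical per position.
library_probs = rng.dirichlet(np.ones(A), size=L_SEQ)  # shape (L_SEQ, A)

def sample_library(n):
    """Draw n sequences from the known library distribution."""
    return np.array([[rng.choice(A, p=library_probs[i]) for i in range(L_SEQ)]
                     for _ in range(n)])

def sample_hits(n):
    """Stand-in for 'functional' hits: library draws enriched for symbol 0."""
    seqs = sample_library(n)
    seqs[rng.random(seqs.shape) < 0.5] = 0
    return seqs

def one_hot(seqs):
    X = np.zeros((len(seqs), L_SEQ * A))
    for k, s in enumerate(seqs):
        X[k, np.arange(L_SEQ) * A + s] = 1.0
    return X

pos = one_hot(sample_hits(500))     # observed positive hits
neg = one_hot(sample_library(500))  # background sampled from the library
X = np.vstack([pos, neg])
y = np.concatenate([np.ones(500), np.zeros(500)])

# Plain logistic regression by gradient ascent on the log-likelihood.
w = np.zeros(X.shape[1])
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.1 * X.T @ (y - p) / len(y)

# After training, weights for symbol 0 should dominate at every position,
# recovering the planted 'functional motif' from positives alone.
```

The design point is the one the post makes: because the proposal distribution is known, unlabeled library draws carry real information, so positives never have to be contrasted against guesswork.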
You can read the full technical note at: jurabio.com/blog/vista
We're on Bluesky at @jura.bsky.social. Thanks for reposting widely to help with reach!
8/8
We're on Bluesky at @jura.bsky.social. My thanks for reposting widely for reach!
8/8
[Link: VISTA: high quality, AI-ready functional biological data loops at massive scale — JURA Bio, Inc. (jurabio.com)]
July 22, 2025 at 12:30 PM