Alex Naka
@gottapatchemall.bsky.social
ML & protein engineering at http://science.xyz | into papers, pens, plots, puppies, proteins

algorithmic.ink
https://x.com/gottapatchemall
thanks!!
October 1, 2025 at 5:23 PM
Reposted by Alex Naka
I LOVE their curtain test!!
September 18, 2025 at 3:53 AM
Combining this with something like a commercially available AR headset is also probably a good direction - something patient-friendly that still allows control over the statistics of the visual input.
September 18, 2025 at 5:41 PM
Great question; the short answer is no, I do not.

There are some avenues here though, e.g. coexpressing with another, less sensitive channelrhodopsin. And I guess if you have to choose between attenuating and amplifying light from the environment, it's easier to attenuate.
September 18, 2025 at 5:39 PM
But it might have something to do with our transfection protocol - we've been hesitant to put too much time into optimizing there, since there are many degrees of freedom and we knew just using AAV would solve things.
September 18, 2025 at 5:32 PM
Thanks! I’m fortunate to work with people who will basically run through walls to get data

Re AAV vs lipofection - our best guess is that it has to do with the higher and more variable copy number from lipofection; it seems easy to make cells unhappy when lipofecting certain opsins in.
September 18, 2025 at 5:30 PM
Tremendous effort by some very talented people: Amanda Tose, Alberto Nava, Sara McGrath, and @mardinly, among many others

This lowercase science is brought to you by uppercase Science
science.xyz/news/new-fr...
Opening New Frontiers in Optogenetic Research, Sensitive Proteins Respond to Ambient Light | Science Corporation
September 18, 2025 at 2:06 PM
We have a ton of fun details and other results that didn't make it in here, but we'll be dropping more soon.

Please send thoughts/feedback. And holler at us if you want to try WAChRs out for your own experiments - we'll send you some.
September 18, 2025 at 2:06 PM
We also did a low-tech experiment we call the curtain test. It's basically peekaboo: cover the rig with a curtain for a while, then open it again. WAChRs respond pretty well.

Simple - but a sanity check for being able to do something like optogenetic vision restoration.
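(Aside: if you wanted to put a number on a curtain test, the simplest summary is just mean activity before, during, and after occlusion. A toy sketch below - the function, variable names, and data are made up for illustration, not the actual analysis.)

```python
# Hypothetical curtain-test summary: compare activity while the rig is covered
# vs. uncovered. Assumes a timestamped activity trace (time_s, rate_hz) and
# known curtain close/open times. Illustrative only, not the actual analysis.
import numpy as np

def curtain_test_summary(time_s, rate_hz, curtain_closed_s, curtain_opened_s):
    """Mean activity before, during, and after curtain occlusion."""
    before = rate_hz[time_s < curtain_closed_s]
    covered = rate_hz[(time_s >= curtain_closed_s) & (time_s < curtain_opened_s)]
    after = rate_hz[time_s >= curtain_opened_s]
    return {
        "ambient_before": float(np.mean(before)),
        "covered": float(np.mean(covered)),
        "ambient_after": float(np.mean(after)),
    }

# Toy usage with synthetic data: activity drops under the curtain, recovers after.
t = np.arange(0, 300, 0.1)                      # 5 min sampled at 10 Hz
r = np.where((t >= 100) & (t < 200), 0.5, 8.0)  # low while covered, high otherwise
print(curtain_test_summary(t, r, curtain_closed_s=100, curtain_opened_s=200))
```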
September 18, 2025 at 2:06 PM
In particular, WAChRs can drive decent photocurrents with very low intensity light.
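One common way to quantify this kind of sensitivity is to fit a Hill curve to photocurrent vs. irradiance and report the half-maximal intensity (EPD50). A sketch below, using illustrative numbers rather than values from the preprint:

```python
# Sketch of a standard sensitivity measure: fit a Hill curve to photocurrent
# vs. irradiance and report the half-maximal intensity (EPD50). The data
# points below are illustrative, not values from the preprint.
import numpy as np
from scipy.optimize import curve_fit

def hill(irradiance, i_max, epd50, n):
    """Photocurrent as a function of light intensity (Hill equation)."""
    return i_max * irradiance**n / (epd50**n + irradiance**n)

# Illustrative intensity-response data (irradiance in µW/mm², current in pA).
irr = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
cur = np.array([30.0, 120.0, 380.0, 700.0, 900.0, 960.0, 980.0])

(i_max, epd50, n), _ = curve_fit(hill, irr, cur, p0=[1000.0, 0.2, 1.0])
print(f"I_max ~ {i_max:.0f} pA, EPD50 ~ {epd50:.2f} µW/mm², Hill n ~ {n:.1f}")
```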
September 18, 2025 at 2:06 PM
We picked three finalists that offer a trade-off between speed and sensitivity: WAChR-f/m/s

We benchmarked these against some existing optogenetic tools on our manual patch clamp rig ("Patchrig Swayze").

We think they offer superior performance for a lot of applications.
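For the speed side of that trade-off, channel closing kinetics are often summarized as tau_off from a single-exponential fit to the photocurrent decay after light offset. A minimal sketch on a synthetic trace (not real benchmark data):

```python
# Sketch of estimating off-kinetics: fit a single exponential to the
# photocurrent decay after light offset and report tau_off. The trace here
# is synthetic, not data from the benchmarking experiments.
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, amp, tau, baseline):
    """Single-exponential relaxation toward baseline."""
    return amp * np.exp(-t / tau) + baseline

# Synthetic post-offset segment: 500 ms at 10 kHz, true tau_off ~ 80 ms.
t = np.arange(0, 0.5, 1e-4)
trace = mono_exp(t, amp=-800.0, tau=0.08, baseline=-5.0)
trace += np.random.default_rng(0).normal(0, 10.0, t.size)  # recording noise

(amp, tau_off, baseline), _ = curve_fit(mono_exp, t, trace, p0=[trace[0], 0.05, 0.0])
print(f"tau_off ~ {tau_off * 1e3:.0f} ms")
```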
September 18, 2025 at 2:06 PM
We kept Mr. Patchino very busy while we iteratively searched for improvements - several hundred new WAChR variants for this project (and 1750+ so far in our broader campaign).
September 18, 2025 at 2:06 PM
We ran an ML-guided directed evolution campaign to optimize WAChR. To do this, we developed a system that uses automated patch clamp to quickly screen opsin functional properties.

(We named him Al Patchino)
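For a sense of what one round of such a loop can look like, here is a generic sketch: train a surrogate model on variants already measured on the automated patch rig, then rank unmeasured candidates for the next screening batch. The one-hot encoding, random-forest model, and batch size are illustrative assumptions, not the actual WAChR pipeline.

```python
# Generic sketch of one round of ML-guided directed evolution - not the
# specific models or acquisition strategy used for WAChR. A surrogate is
# trained on variants already measured by automated patch clamp, then used
# to rank unmeasured candidates for the next screening batch.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def one_hot(seq):
    """Flat one-hot encoding of a protein sequence."""
    x = np.zeros((len(seq), len(AMINO_ACIDS)))
    for pos, aa in enumerate(seq):
        x[pos, AA_INDEX[aa]] = 1.0
    return x.ravel()

def propose_next_batch(measured_seqs, measured_scores, candidate_seqs, batch_size=96):
    """Train a surrogate on measured variants, return top-ranked candidates."""
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(np.stack([one_hot(s) for s in measured_seqs]), measured_scores)
    preds = model.predict(np.stack([one_hot(s) for s in candidate_seqs]))
    best = np.argsort(preds)[::-1][:batch_size]
    return [candidate_seqs[i] for i in best]
```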
September 18, 2025 at 2:06 PM
We used the K+-selective opsin WiChR as a starting point: pmc.ncbi.nlm.nih.gov/articles/PM...

WiChR is a strong, sensitive optogenetic silencer. We applied a mutation that breaks K+ selectivity, which converts it into an excitatory channel. We call this mutant WAChR (pronounced "watcher").
September 18, 2025 at 2:06 PM
Preprint here
www.biorxiv.org/content/10....

Short thread below
September 18, 2025 at 2:06 PM