Moon
@masayukinagai.bsky.social
PhD student @ Koo lab, CSHL | Computational Biology | AI Alignment
Makes me think it would be more fruitful to focus on robust model training to achieve smoother sequence-to-function landscapes in non-OOD space
April 17, 2025 at 2:48 AM
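A minimal sketch of one reading of that idea, assuming one-hot (batch, 4, length) inputs: augment training with random point mutations and penalize the resulting prediction shift, a consistency-style regularizer. The function names and penalty form are illustrative, not a method proposed in the thread.

```python
import torch

def random_point_mutations(x, n_mut=1):
    """Substitute n_mut random positions in each one-hot sequence.

    x: (batch, 4, length) one-hot DNA tensor.
    """
    x = x.clone()
    batch, alphabet, length = x.shape
    for i in range(batch):
        pos = torch.randint(0, length, (n_mut,))
        base = torch.randint(0, alphabet, (n_mut,))
        x[i, :, pos] = 0.0   # zero out the original base
        x[i, base, pos] = 1.0  # set the mutated base
    return x

def smoothness_penalty(model, x, lam=0.1):
    """Penalize prediction changes under single-base perturbations,
    nudging the model toward a locally smoother landscape."""
    return lam * ((model(x) - model(random_point_mutations(x))) ** 2).mean()
```

In training this would simply be added to the task loss, e.g. `loss = task_loss + smoothness_penalty(model, x)`.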
Thanks for sharing the insights on smoother pooling operations; it gives me a lot to think about! And I generally agree about the filter interpretation limitations. Going over various attribution methods, mostly developed in non-biology contexts, I find it tricky to gauge their reliability.
April 17, 2025 at 2:48 AM
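One cheap reliability check along those lines is to compare two standard attribution maps on the same input and see whether they even agree. A sketch assuming a trained PyTorch `model` over one-hot (1, 4, L) sequences; gradient-times-input and in silico mutagenesis are both standard methods, and the comparison itself is the point.

```python
import torch

def grad_x_input(model, x):
    """Gradient-times-input attribution per position; x: (1, 4, L) one-hot."""
    x = x.clone().requires_grad_(True)
    model(x).sum().backward()
    return (x.grad * x).sum(dim=1).squeeze(0)

def ism(model, x):
    """In silico mutagenesis: mean |prediction change| per position."""
    scores = torch.zeros(x.shape[-1])
    with torch.no_grad():
        ref = model(x).sum()
        for pos in range(x.shape[-1]):
            for base in range(4):
                x_mut = x.clone()
                x_mut[0, :, pos] = 0.0
                x_mut[0, base, pos] = 1.0
                scores[pos] += (model(x_mut).sum() - ref).abs() / 4
    return scores

# If the two maps rank positions very differently, at least one of
# the attribution methods is unreliable on this input.
```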
(And speaking as a student in Peter Koo's lab...) pooling in early layers could encourage initial conv filters to capture more complete patterns rather than partial ones.

(But I like the ChromBPNet architecture, and the separation of bias signal in the PISA paper was beautiful!)
April 17, 2025 at 1:43 AM
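Roughly what that could look like in PyTorch; the layer sizes here are illustrative, not ChromBPNet's or any published model's.

```python
import torch.nn as nn

# Motif-scale first-layer filters followed immediately by wide max-pooling.
# Because pooling discards exact positions early, partial motif matches are
# hard to reassemble downstream, which pressures the first-layer filters to
# capture whole motifs on their own.
trunk = nn.Sequential(
    nn.Conv1d(4, 64, kernel_size=19, padding=9),  # motif-scale filters
    nn.ReLU(),
    nn.MaxPool1d(kernel_size=10),                 # early, aggressive pooling
)
```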
I agree it's more of a workaround for pooling architectures than a complete solution. But without pooling, I'm not sure how to achieve long receptive fields without over-parameterizing the model.
April 17, 2025 at 1:43 AM
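Back-of-the-envelope arithmetic behind that trade-off (the layer shapes are made up for illustration): with pooling, the receptive field compounds with the stride, so a few layers cover hundreds of bases; without it, each stride-1 conv adds only kernel_size - 1 positions, so matching the same receptive field takes over a hundred layers of extra parameters.

```python
def receptive_field(layers):
    """Receptive field of a stack of 1D layers.

    layers: list of (kernel_size, stride) pairs, applied in order.
    """
    rf, jump = 1, 1
    for k, s in layers:
        rf += (k - 1) * jump  # each layer widens rf by (k-1) * current stride product
        jump *= s
    return rf

# Conv(k=19) then four blocks of conv(k=7) + maxpool(k=4, stride=4):
with_pooling = [(19, 1)] + [(7, 1), (4, 4)] * 4
print(receptive_field(with_pooling))  # 784 bp from 9 layers

# Matching that with plain stride-1 convs (k=7) takes ~128 layers,
# since each one adds only 6 bp of receptive field.
no_pooling = [(19, 1)] + [(7, 1)] * 128
print(receptive_field(no_pooling))    # 787 bp
```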
These strategies reduce technical variance and improve eQTL benchmarking, but substantial variance remains, pointing to the need for boundary-free architectures or more robust training. 2/2

Full read: www.biorxiv.org/content/10.1...
Shift augmentation improves DNA convolutional neural network indel effect predictions
Determining genetic variant effects on molecular phenotypes like gene expression is a task of paramount importance to medical genetics. DNA convolutional neural networks (CNNs) attain state-of-the-art...
www.biorxiv.org
April 14, 2025 at 7:17 PM
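The core augmentation in the preprint's title is simple to sketch, assuming one-hot (4, L) inputs; the paper's exact scheme (shift range, padding choice, whether shifts are also averaged at test time) may differ from this version.

```python
import torch

def shift_augment(x, max_shift=3):
    """Randomly shift a one-hot sequence x (4, L) by up to max_shift bp,
    zero-padding the vacated edge (an all-zero 'N' column)."""
    s = int(torch.randint(-max_shift, max_shift + 1, (1,)))
    out = torch.zeros_like(x)
    if s > 0:
        out[:, s:] = x[:, :-s]   # shift right, pad left
    elif s < 0:
        out[:, :s] = x[:, -s:]   # shift left, pad right
    else:
        out = x.clone()
    return out
```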