Jean-Philip Piquemal
@jppiquem.bsky.social
Professor of Theoretical Chemistry @sorbonne-universite.fr & Director @lct-umr7616.bsky.social | Co-Founder & CSO @qubit-pharma.bsky.social | (My Views)
https://piquemalresearch.com | https://tinker-hp.org
Pinned
#compchem #machinelearning
1st of the year in J. Phys. Chem. Lett.: "Accelerating Molecular Dynamics Simulations with Foundation Neural Network Models using Multiple Time-Step and Distillation". pubs.acs.org/doi/full/10....
(see also the updated preprint: arxiv.org/abs/2510.06562)
Accelerating Molecular Dynamics Simulations with Foundation Neural Network Models Using Multiple Time Steps and Distillation
We present a distilled multi-time-step (DMTS) strategy to accelerate molecular dynamics simulations using foundation neural network models. DMTS uses a dual-level neural network, where the target accurate potential is coupled to a simpler but faster model obtained via a distillation process. The 3.5 Å cutoff distilled model is sufficient to capture the fast-varying forces, i.e., mainly bonded interactions, from the accurate potential, allowing its use in a reversible reference system propagator algorithm (RESPA)-like formalism. The approach conserves accuracy, preserving both static and dynamic properties, while enabling us to evaluate the costly model only every 3 to 6 fs depending on the system. Consequently, large simulation speedups over standard 1 fs integration are observed: nearly 4-fold in homogeneous systems and 3-fold in large solvated proteins through leveraging active learning for enhanced stability. Such a strategy is applicable to any neural network potential and reduces the performance gap with classical force fields.
pubs.acs.org
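For readers who want to see the mechanics behind the abstract, here is a minimal Python sketch (not the paper's implementation) of a RESPA-like dual-level multi-time-step integrator: a hypothetical cheap fast_forces() stands in for the distilled short-range model evaluated at every inner step, and a hypothetical expensive slow_forces() stands in for the correction from the full foundation model, applied only once per outer step.

import numpy as np

def respa_step(pos, vel, masses, fast_forces, slow_forces, dt_outer, n_inner):
    """One RESPA-like outer step of length dt_outer (hedged sketch).

    fast_forces(pos) -> cheap, short-range forces (distilled model)
    slow_forces(pos) -> expensive correction forces (full model minus distilled model)
    Both are hypothetical callables returning arrays shaped like pos.
    """
    dt_inner = dt_outer / n_inner
    inv_m = 1.0 / masses[:, None]

    # Half-kick with the slowly varying (expensive) forces.
    vel = vel + 0.5 * dt_outer * slow_forces(pos) * inv_m

    # Inner velocity-Verlet loop driven by the cheap distilled forces.
    f_fast = fast_forces(pos)
    for _ in range(n_inner):
        vel = vel + 0.5 * dt_inner * f_fast * inv_m
        pos = pos + dt_inner * vel
        f_fast = fast_forces(pos)
        vel = vel + 0.5 * dt_inner * f_fast * inv_m

    # Closing half-kick keeps the splitting symmetric and time-reversible.
    vel = vel + 0.5 * dt_outer * slow_forces(pos) * inv_m
    return pos, vel

With an outer step of 3 to 6 fs and a ~1 fs inner step, the expensive model is called once per outer step instead of at every step, which is where the reported 3- to 4-fold speedup comes from.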
#compchem #compchemsky Our paper in J. Phys. Chem. Lett.: "Accelerating Molecular Dynamics Simulations with Foundation Neural Network Models using Multiple Time-Step and Distillation" made it to one of the covers! pubs.acs.org/doi/full/10....
February 5, 2026 at 10:58 AM
#quantumcomputing Good read: Quantum computers will finally be useful: what’s behind the revolution
www.nature.com/articles/d41...
Quantum computers will finally be useful: what’s behind the revolution
A string of surprising advances suggests usable quantum computers could be here in a decade.
www.nature.com
February 5, 2026 at 8:53 AM
Reposted by Jean-Philip Piquemal
🚀 Game-changing speed for Relative Binding Free Energy (RBFE) calculations in drug discovery, without trading away accuracy.
Dual-LAO delivers 15–30× faster simulations while maintaining industry-leading accuracy (~0.5–0.6 kcal/mol). #compchem
t.co/dDLVqXKvZm
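As background (not claimed in the post): a relative binding free energy compares two ligands A and B through a thermodynamic cycle, so only the alchemical A→B transformations in the bound complex and in solvent need to be simulated,

\Delta\Delta G_{\mathrm{bind}}(A \to B) \;=\; \Delta G_{\mathrm{bind}}(B) - \Delta G_{\mathrm{bind}}(A) \;=\; \Delta G^{\mathrm{complex}}_{A \to B} - \Delta G^{\mathrm{solvent}}_{A \to B},

and the quoted ~0.5–0.6 kcal/mol presumably refers to the typical error on this \Delta\Delta G.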
January 29, 2026 at 4:20 PM
Reposted by Jean-Philip Piquemal
🚀 First paper published!
We introduce DMTS, a multi-time-step method for ML force fields
✔️×4 speed-up
✔️Accuracy preserved
✔️Generalizable to any ML potential
📄Link: pubs.acs.org/doi/full/10....
The preprint: arxiv.org/abs/2510.06562
@jppiquem.bsky.social
#MolecularDynamics #MachineLearning
January 28, 2026 at 5:59 PM
Reposted by Jean-Philip Piquemal
🤩 New year, new publication using the FeNNix-Bio1 foundation model!

🚀 "Accelerating Molecular Dynamics Simulations with Foundation Neural Network Models using Multiple Time-Step and Distillation" published in the Journal of Physical Chemistry Letters
#compchemsky #biosky #machinelearning
#compchem #machinelearning
1st of the year in J. Phys. Chem. Lett.: "Accelerating Molecular Dynamics Simulations with Foundation Neural Network Models using Multiple Time-Step and Distillation". pubs.acs.org/doi/full/10....
(see also the updated preprint: arxiv.org/abs/2510.06562)
January 24, 2026 at 6:39 AM
#compchem #machinelearning
1st of the year in J. Phys. Chem. Lett.: "Accelerating Molecular Dynamics Simulations with Foundation Neural Network Models using Multiple Time-Step and Distillation". pubs.acs.org/doi/full/10....
(see also the updated preprint: arxiv.org/abs/2510.06562)
January 21, 2026 at 12:06 PM
Cheers to 2026! Happy new year everyone.
December 31, 2025 at 3:35 PM
#hpc #supercomputing #machinelearning #compchem
New Grand Challenges @gencifrance.bsky.social report dedicated to the Jean Zay 4 machine at IDRIS. Our work on the FeNNix-Bio1 machine learning foundation model can be found on pages 22-25.
genci.fr/sites/defaul...
December 28, 2025 at 7:51 AM
#compchem #compbio Good read: Fast Parametrization of Martini3 Models for Fragments and Small Molecules pubs.acs.org/doi/10.1021/...
Fast Parametrization of Martini3 Models for Fragments and Small Molecules
Coarse-grained molecular dynamics simulations, such as those performed with the recently parametrized Martini 3 force field, simplify molecular models and enable the study of larger systems over longer time scales. With this new implementation, Martini 3 allows more bead types and sizes, becoming more amenable to studying dynamical phenomena involving small molecules such as protein–ligand interactions and membrane permeation. However, while solutions existed to automatically model small molecules using the previous iteration of the Martini force field, there is no simple way to generate such molecules for Martini 3 yet. Here, we introduce Auto-MartiniM3, an advanced and updated version of the Auto-Martini program designed to automate the coarse-graining of small molecules to be used with the Martini 3 force field. We validated our approach by modeling 81 simple molecules from the Martini Database and comparing their structural and thermodynamic properties with those obtained from models designed by Martini experts. Additionally, we assessed the behavior of Auto-MartiniM3-generated models by calculating solute translocation and free energy across lipid bilayers. We also evaluated more complex molecules such as caffeine by testing its binding to the adenosine A2A receptor. Finally, our results from deploying Auto-MartiniM3 on a large data set of molecular fragments demonstrate that this program can become a tool of choice for fast, high-throughput creation of coarse-grained models of small molecules, offering a good balance between automation and accuracy. Auto-MartiniM3 source code is freely available at https://github.com/Martini-Force-Field-Initiative/Automartini_M3.
pubs.acs.org
December 27, 2025 at 10:19 AM
#compchem Good read: Automated Machine Learning Pipeline: Large Language Models-Assisted Automated Data set Generation for Training Machine-Learned Interatomic Potentials pubs.acs.org/doi/10.1021/...
Automated Machine Learning Pipeline: Large Language Models-Assisted Automated Data set Generation for Training Machine-Learned Interatomic Potentials
Machine learning interatomic potentials (MLIPs) have become powerful tools to extend molecular simulations beyond the limits of quantum methods, offering near-quantum accuracy at much lower computatio...
pubs.acs.org
December 27, 2025 at 10:18 AM
Merry Christmas!!!
December 24, 2025 at 6:32 AM
#compchem #compbio Last preprint of the year: "Fast, systematic and robust relative binding free energies for simple and complex transformations: dual-LAO".
arxiv.org/abs/2512.17624
Great work by N. Ansari (@qubit-pharma.bsky.social).
Another nice collab with J. Hénin.
December 23, 2025 at 6:24 AM
Reposted by Jean-Philip Piquemal
Wishing you happy holidays. See you in 2026!!! #compchem
December 19, 2025 at 11:07 AM
Also, if you check the GitHub repo, FeNNol can launch MACE, MACE-OFF and ANI simulations. Enjoy! #compchem
💫 We just released the weights of the #FeNNixBio1 foundation machine learning model for drug design! 💫

Weights: github.com/FeNNol-tools...
FeNNol code: github.com/FeNNol-tools...
The models are distributed under the ASL license (for non-commercial academic research). #compchem #compbio
December 18, 2025 at 10:56 AM
💫 We just released the weights of the #FeNNixBio1 foundation machine learning model for drug design! 💫

Weights: github.com/FeNNol-tools...
FeNNol code: github.com/FeNNol-tools...
The models are distributed under the ASL license (for non-commercial academic research). #compchem #compbio
GitHub - FeNNol-tools/FeNNol-PMC: FeNNol Pretrained Models Collection
FeNNol Pretrained Models Collection. Contribute to FeNNol-tools/FeNNol-PMC development by creating an account on GitHub.
github.com
December 17, 2025 at 1:58 PM
Reposted by Jean-Philip Piquemal
"C'est un vrai cri d'alarme" : de Lille à Angers, des universités alertent sur leur situation financière
www.franceinfo.fr/societe/educ...
@afp.com @franceinfo.fr
"C'est un vrai cri d'alarme" : de Lille à Angers, des universités alertent sur leur situation financière
L'an dernier, les présidents d'universités étaient déjà montés au créneau en décembre pour dénoncer les restrictions budgétaires demandées par le gouvernement, après déjà plusieurs années de sous-fina...
www.franceinfo.fr
December 12, 2025 at 8:46 AM
Reposted by Jean-Philip Piquemal
Congrats to Cesar Féniou, who successfully defended his PhD: "Quantum algorithms for first-principles quantum chemistry"
@qubit-pharma.bsky.social

#compchem #quantumcomputing
December 12, 2025 at 6:47 AM
Reposted by Jean-Philip Piquemal
New paper in collaboration with Q-CTRL demonstrating the use of NISQ hardware for the water placement problem in drug design, up to 123 qubits on IBM's Heron QPU! #quantumcomputing #compchem #drugdesign
#compchem #quantumcomputing
I’m thrilled to share this new preprint: "Practical protein-pocket hydration-site prediction for drug discovery on a quantum computer".
👉Check it out: arxiv.org/abs/2512.08390

Great collab with D. Loco (@qubit-pharma.bsky.social), K. Barkemeyer & A. Carvalho (Q-CTRL)
December 11, 2025 at 5:47 AM
#compchem #quantumcomputing
I’m thrilled to share this new preprint: "Practical protein-pocket hydration-site prediction for drug discovery on a quantum computer".
👉Check it out: arxiv.org/abs/2512.08390

Great collab with D. Loco (@qubit-pharma.bsky.social), K. Barkemeyer & A. Carvalho (Q-CTRL)
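As context (not a description of the preprint's actual method): site-selection problems like water placement are often cast as a QUBO, i.e., minimizing an energy over binary occupation variables, which is one common route onto quantum hardware. A toy Python sketch with hypothetical on-site energies h and pairwise couplings J:

import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 6                                      # toy number of candidate hydration sites
h = rng.normal(size=n)                     # hypothetical per-site placement energies
J = np.triu(rng.normal(size=(n, n)), k=1)  # hypothetical pairwise couplings (e.g., clashes)

def qubo_energy(x, h, J):
    """Energy of a binary occupation vector x (x[i] = 1 means 'place a water at site i')."""
    x = np.asarray(x, dtype=float)
    return float(h @ x + x @ J @ x)

# Brute-force ground state for the toy size; a quantum (or classical) optimizer
# would replace this exhaustive search for realistic pocket sizes.
best = min(itertools.product([0, 1], repeat=n), key=lambda x: qubo_energy(x, h, J))
print("lowest-energy occupation:", best, qubo_energy(best, h, J))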
December 10, 2025 at 5:51 AM
Quantum Zeitgeist @superposition.bsky.social highlighted our recent preprint: "An Optimal Framework for Constructing Lie-Algebra Generator Pools: Application to Variational Quantum Eigensolvers for Chemistry" #quantumcomputing #compchem
quantumzeitgeist.com/variational-...
Optimal Framework Constructs Lie-Algebra Generator Pools, Enabling Efficient Variational Quantum Eigensolvers For Chemistry
Researchers have developed a new mathematical strategy that efficiently identifies the essential building blocks of complex systems, dramatically improving computational power for applications ranging...
quantumzeitgeist.com
December 8, 2025 at 12:06 PM
Reposted by Jean-Philip Piquemal
Over the past century, quantum mechanics has served as the foundation for modern physics theories, including quantum field theory. The theoretical groundwork is now being transformed into new disciplines.

Learn more in a new special issue of Science: https://scim.ag/4oz1y2a
December 4, 2025 at 7:05 PM
This Monday, December 8, I will be representing @qubit-pharma.bsky.social at the "Quantum & Artificial Intelligence - Towards a convergence of technological disruptions?" event.
evenium.events/quantum-inte...

#quantumcomputing #AI #artificialintelligence #machinelearning
December 6, 2025 at 9:58 AM
Reposted by Jean-Philip Piquemal
#quantumcomputing #compchem
New preprint! The presented mathematical framework is general & applicable well beyond chemistry, in fields including quantum error correction, quantum control, quantum machine learning, and more universally wherever compact Pauli bases are required. Congrats to the team!
#compchem #compchemsky #quantumcomputing
New group preprint: "An Optimal Framework for Constructing Lie-Algebra Generator Pools: Application to Variational Quantum Eigensolvers for Chemistry."
👉Check it out: arxiv.org/abs/2511.22593
@piquemalgroup.bsky.social @qubit-pharma.bsky.social
December 2, 2025 at 6:33 AM
#compchem #compchemsky #quantumcomputing
New group preprint: "An Optimal Framework for Constructing Lie-Algebra Generator Pools: Application to Variational Quantum Eigensolvers for Chemistry."
👉Check it out: arxiv.org/abs/2511.22593
@piquemalgroup.bsky.social @qubit-pharma.bsky.social
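For readers curious what a "generator pool" involves, here is a rough illustration (a sketch, not the preprint's construction): pools for VQE-type ansätze are built from Pauli strings, and the Lie algebra they generate closes under commutation because the commutator of two Pauli strings is either zero or proportional to another Pauli string. A small numpy check in Python:

import numpy as np
from functools import reduce

# Single-qubit Pauli matrices.
PAULI = {
    "I": np.eye(2, dtype=complex),
    "X": np.array([[0, 1], [1, 0]], dtype=complex),
    "Y": np.array([[0, -1j], [1j, 0]], dtype=complex),
    "Z": np.array([[1, 0], [0, -1]], dtype=complex),
}

def pauli_string(label):
    """Tensor product of single-qubit Paulis, e.g. 'XYZ' -> X (x) Y (x) Z."""
    return reduce(np.kron, (PAULI[c] for c in label))

def commutator(a, b):
    return a @ b - b @ a

# Two Pauli strings commute iff they anticommute on an even number of qubit
# positions (sites where both are non-identity and different); here X/Z and
# X/Y each anticommute, so the pair below commutes overall.
A, B = pauli_string("XXI"), pauli_string("ZYI")
print("commute?", np.allclose(commutator(A, B), 0))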
December 2, 2025 at 6:31 AM