Théo Gnassounou
@tgnassou.bsky.social
Ph.D. student in Machine Learning and Domain Adaptation for Neuroscience at Inria Saclay / MIND.
Website: https://tgnassou.github.io/
Skada: https://scikit-adaptation.github.io/
📩 Message me if you’d like to participate!

Skada: github.com/scikit-adapt...
May 20, 2025 at 9:30 AM
🎯 Goal of the Skada Coding Sprint
- Improve Skada: add new methods, improve documentation, fix bugs...
- Contribute to open source in a welcoming environment
- Collaborate with a community of ML researchers and developers
- Implement and test your own Domain Adaptation methods
- Have a lot of fun!
💡 To make it easier for everyone to apply these techniques, we built Skada: a simple, Python-based library for Domain Adaptation.
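As a flavor of what such a toolbox automates, here is a minimal NumPy sketch of correlation alignment (CORAL-style recoloring), one classic shallow DA technique: source features are whitened with the source covariance and recolored with the target covariance. This is a from-scratch illustration, not Skada's actual API:

```python
import numpy as np

def _sqrtm(C):
    # Symmetric PSD matrix square root via eigendecomposition.
    w, V = np.linalg.eigh(C)
    return (V * np.sqrt(np.clip(w, 0, None))) @ V.T

def coral_align(Xs, Xt, reg=1e-6):
    """Recolor source features so their covariance matches the target's."""
    d = Xs.shape[1]
    Cs = np.cov(Xs, rowvar=False) + reg * np.eye(d)
    Ct = np.cov(Xt, rowvar=False) + reg * np.eye(d)
    # Whiten with the source covariance, recolor with the target's.
    return Xs @ np.linalg.inv(_sqrtm(Cs)) @ _sqrtm(Ct)

rng = np.random.default_rng(0)
Xs = rng.normal(size=(500, 3)) @ np.diag([1.0, 2.0, 0.5])  # source
Xt = rng.normal(size=(500, 3)) @ np.diag([3.0, 1.0, 1.5])  # target
Xs_aligned = coral_align(Xs, Xt)
# After alignment the source covariance is close to the target's.
print(np.allclose(np.cov(Xs_aligned, rowvar=False),
                  np.cov(Xt, rowvar=False), atol=1e-2))
```

A classifier is then trained on the aligned source features and applied directly to the target domain.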

The Skada team (@rflamary.bsky.social, @antoinecollas.bsky.social, @ambroiseodt.bsky.social) is organizing a coding sprint!

📍 When: June 24–25
📍 Where: Inria Saclay
SKADA-Bench is built on SKADA: github.com/scikit-adapt...

This work results from a collaboration with Yanis Lalou, @antoinecollas.bsky.social, Antoine de Mathelin, Oleksii Kachaiev, @ambroiseodt.bsky.social, Alexandre Gramfort, Thomas Moreau and @rflamary.bsky.social!
February 12, 2025 at 3:17 PM
The benchmark shows deep DA methods struggle beyond computer vision, highlighting their limits on other modalities!
The results show the benefit of DA in some cases, but parameter-sensitive shallow methods struggle to adapt to new domains. It is better to use methods with few parameters, like LinOT & CORAL!
The benchmark uses a realistic scenario: hyperparameters are validated with nested cross-validation loops and DA scorers!
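As an illustration of the nested-loop idea, here is a plain scikit-learn sketch: the inner loop selects hyperparameters, the outer loop gives an unbiased performance estimate. For simplicity this uses supervised accuracy as the inner score, whereas the benchmark itself uses unsupervised DA scorers on unlabeled target data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score

# Toy labeled data standing in for the source domain.
X, y = make_classification(n_samples=300, n_features=5, random_state=0)

# Inner loop: hyperparameter selection on 3 folds.
inner = GridSearchCV(LogisticRegression(max_iter=1000),
                     param_grid={"C": [0.01, 0.1, 1.0, 10.0]}, cv=3)

# Outer loop: 5-fold estimate of the tuned model's performance.
outer_scores = cross_val_score(inner, X, y, cv=5)
print(outer_scores.mean())
```

The key point is that the data used to pick hyperparameters never overlaps with the data used to report the final score.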
🔬 What’s inside?
• Multi-Modality Benchmark: 4 simulated + 8 real datasets
• 20 Shallow DA Methods: Reweighting, mapping, subspace alignment & others
• 7 Deep DA Methods: CAN, MCC, MDD, SPA & more
• 7 Unsupervised Validation Scorers
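One of the simplest unsupervised validation criteria is prediction entropy: when target labels are unavailable, a model that is confident on unlabeled target data is preferred. A minimal sketch of that idea (a from-scratch illustration, not Skada's scorer implementation):

```python
import numpy as np

def prediction_entropy(proba, eps=1e-12):
    """Mean Shannon entropy of predicted class probabilities.

    Lower entropy on unlabeled target data serves as a proxy for a
    better-adapted model when no target labels are available.
    """
    proba = np.clip(proba, eps, 1.0)
    return float(-(proba * np.log(proba)).sum(axis=1).mean())

confident = np.array([[0.95, 0.05], [0.90, 0.10]])  # sharp predictions
uncertain = np.array([[0.50, 0.50], [0.60, 0.40]])  # near-uniform predictions
print(prediction_entropy(confident) < prediction_entropy(uncertain))  # True
```

A caveat the benchmark highlights: confidence-based scorers can favor models that are confidently wrong, which is why several complementary scorers are compared.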
Domain Adaptation (DA) adapts machine learning models to distribution shifts between training and test sets. We propose SKADA-Bench, the first comprehensive, reproducible benchmark that evaluates DA methods across multiple modalities: computer vision, natural language processing, tabular, and biomedical data.
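To make the distribution-shift setting concrete, here is a sketch of one classic DA strategy, importance reweighting: a domain classifier estimates how much each source sample resembles the target distribution, via w(x) ∝ P(target | x) / P(source | x). This is a from-scratch illustration with scikit-learn, not the benchmark code:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Covariate shift: same labeling function, shifted input distribution.
Xs = rng.normal(loc=0.0, size=(400, 2))   # source inputs
Xt = rng.normal(loc=1.0, size=(400, 2))   # target inputs, shifted by +1

# Train a classifier to distinguish source (0) from target (1) samples.
X = np.vstack([Xs, Xt])
d = np.concatenate([np.zeros(400), np.ones(400)])
clf = LogisticRegression().fit(X, d)

# Importance weight of each source sample: odds of being "target-like".
p = clf.predict_proba(Xs)[:, 1]
w = p / (1 - p)

# Source points lying where the target density is high get larger weights;
# a downstream model can then be fit with sample_weight=w.
print(w[Xs[:, 0] > 1].mean() > w[Xs[:, 0] < 0].mean())
```

Reweighting the source loss this way corrects the training distribution toward the target one, at the cost of higher variance when the domains overlap poorly.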
This library is a team effort: @antoinecollas.bsky.social, Oleksii Kachaiev, @rflamary.bsky.social, Yanis Lalou, Antoine de Mathelin, Ruben Bueno, Apolline Mellot, @ambroiseodt.bsky.social, Alexandre Gramfort and myself!
December 6, 2024 at 3:50 PM
🔀 Improved Subsampling Tools
- Added StratifiedDomainSubsampler and DomainSubsampler to handle large datasets effortlessly.

🤖 Deep Model Enhancements
- Smarter batch handling.
- Many bug fixes.

📖 Documentation Upgrades
- Contributor Guide: Join the development of Skada!
- New Logo!!
📊 Advanced Scorers
- New MixValScorer for mixup validation.
- Enhanced scorer compatibility with deep models.
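The exact MixVal criterion is described in its paper; as a rough illustration of the underlying mixup-consistency idea, one can interpolate pairs of unlabeled target samples and check how often the prediction on the mixture agrees with the prediction on the dominant endpoint. This is a toy sketch of the concept, not Skada's MixValScorer:

```python
import numpy as np

def mixup_consistency(predict, Xt, lam=0.8, seed=0):
    """Fraction of mixed target pairs whose prediction matches the
    dominant endpoint's prediction (toy proxy for mixup validation)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(Xt))
    # Convex combination of each target sample with a random partner.
    X_mix = lam * Xt + (1 - lam) * Xt[idx]
    return float(np.mean(predict(X_mix) == predict(Xt)))

# Hypothetical toy "model": predicts the sign of the first feature.
predict = lambda X: (X[:, 0] > 0).astype(int)
Xt = np.random.default_rng(1).normal(size=(200, 2))
print(mixup_consistency(predict, Xt))
```

Intuitively, a model with smooth, well-placed decision boundaries on the target domain stays consistent under such interpolations.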
💡New Deep Domain Adaptation Methods: CAN, SPA, MCC, and MDD. These methods combine the cross-entropy loss on the source domain with domain-aware losses (graph-based, adversarial, class confusion, …).
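As a rough illustration of one such domain-aware loss, here is a simplified class-confusion term in the spirit of MCC: it penalizes off-diagonal mass in the batch's class-confusion matrix. The actual method also uses temperature scaling and entropy weighting over a deep network's batch predictions, omitted in this NumPy sketch:

```python
import numpy as np

def class_confusion_loss(proba):
    """Simplified MCC-style loss: off-diagonal mass of the row-normalized
    class-confusion matrix (lower = less confusion between classes)."""
    C = proba.T @ proba                   # class-confusion matrix (C x C)
    C = C / C.sum(axis=1, keepdims=True)  # row-normalize
    n_classes = C.shape[0]
    return float((C.sum() - np.trace(C)) / n_classes)

sharp = np.eye(3)[[0, 1, 2, 0]]  # confident one-hot predictions -> loss 0
flat = np.full((4, 3), 1 / 3)    # maximally confused predictions
print(class_confusion_loss(sharp), class_confusion_loss(flat))
```

In training, a term like this is computed on unlabeled target batches and added to the supervised source loss.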
💡New Shallow Domain Adaptation Methods: MongeAlignment and JCPOT for linear multi-source domain adaptation with optimal transport.
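For Gaussian-distributed domains, the optimal-transport (Monge) map has a closed linear form: T(x) = m_t + A(x - m_s) with A = S_s^(-1/2) (S_s^(1/2) S_t S_s^(1/2))^(1/2) S_s^(-1/2), where m and S are the domain means and covariances. A NumPy sketch of this map, as an illustration of the idea behind linear OT alignment rather than the library's implementation:

```python
import numpy as np

def _sqrtm(C):
    # Symmetric PSD matrix square root via eigendecomposition.
    w, V = np.linalg.eigh(C)
    return (V * np.sqrt(np.clip(w, 0, None))) @ V.T

def gaussian_monge_map(Xs, Xt):
    """Closed-form linear Monge map between Gaussian approximations
    of the source and target domains: T(x) = m_t + A (x - m_s)."""
    ms, mt = Xs.mean(0), Xt.mean(0)
    Cs, Ct = np.cov(Xs, rowvar=False), np.cov(Xt, rowvar=False)
    Cs_sqrt = _sqrtm(Cs)
    Cs_inv_sqrt = np.linalg.inv(Cs_sqrt)
    A = Cs_inv_sqrt @ _sqrtm(Cs_sqrt @ Ct @ Cs_sqrt) @ Cs_inv_sqrt
    return lambda X: mt + (X - ms) @ A.T

rng = np.random.default_rng(0)
Xs = rng.normal(size=(1000, 2)) @ np.array([[2.0, 0.0], [0.5, 1.0]])
Xt = rng.normal(loc=3.0, size=(1000, 2))
T = gaussian_monge_map(Xs, Xt)
Xs_mapped = T(Xs)
# The mapped source matches the target's empirical mean and covariance.
print(np.allclose(np.cov(Xs_mapped, rowvar=False),
                  np.cov(Xt, rowvar=False), atol=1e-6))
```

Because the map is affine with few parameters, it fits the "low-parameter methods travel well across domains" finding from SKADA-Bench.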