Joachim W Pedersen
joachimwpedersen.bsky.social
Bio-inspired AI, meta-learning, evolution, self-organization, developmental algorithms, and structural flexibility.
Postdoc @ ITU of Copenhagen.
https://scholar.google.com/citations?user=QVN3iv8AAAAJ&hl=en
Very satisfying to see one's code run on actual real-world robots and not just simulation.
Check out the paper here:
arxiv.org/pdf/2503.12406
March 19, 2025 at 10:25 AM
Both 4-page papers presenting early research and 8-page papers with more substantial results are welcome!
February 10, 2025 at 1:46 PM
Very cool! And great aesthetics as well 🙌 😊
January 21, 2025 at 6:22 PM
3) Optimizer optimization: Think hyperparameter tuning, e.g., of the learning rate. The search procedure within the inner loop itself is altered.

We use meta-learning to achieve improved inner-loop optimization, so it is well worth considering exactly how our double-loop achieves this!
#meta-learning #deeplearning #ai
December 4, 2024 at 3:21 AM
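A minimal sketch of this third flavor, optimizer optimization, on a hypothetical toy problem: the inner loop is plain gradient descent on a fixed quadratic loss, and the outer loop searches over the inner optimizer's only hyperparameter, the learning rate. The loss function and candidate learning rates are illustrative assumptions, not from the paper.

```python
# Hypothetical toy task: minimize f(w) = (w - 3)^2 with gradient descent.
def inner_loop(lr, steps=10, w0=0.0):
    """Run the inner-loop optimizer and return the final loss."""
    w = w0
    for _ in range(steps):
        grad = 2 * (w - 3.0)  # df/dw
        w -= lr * grad
    return (w - 3.0) ** 2

# Outer loop: optimizer optimization, i.e., a search over the inner
# optimizer's hyperparameter (here just the learning rate).
candidate_lrs = [0.01, 0.1, 0.3, 0.6]
best_lr = min(candidate_lrs, key=inner_loop)
print(best_lr)  # -> 0.6
```

Here the outer loop is a grid search for clarity; in practice it could itself be gradient-based or evolutionary.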
1) Starting point optimization: Think MAML. Move the initial point of the inner-loop search to a better place to learn quickly.
2) Loss landscape optimization: Think neural architecture search. The inner loop's loss landscape is transformed.
December 4, 2024 at 3:21 AM
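A toy sketch of the first flavor, MAML-style starting-point optimization: the outer loop nudges a shared initialization so that the post-adaptation loss across a family of tasks is low. The task family (minimize (w - a)^2 for task-specific targets a) and the analytic meta-gradient are illustrative assumptions chosen so the meta-gradient has a closed form.

```python
import random

random.seed(0)

# Hypothetical task family: each task asks us to minimize (w - a)^2 for a
# task-specific target a. A good shared init is the mean of the targets.
def sample_task():
    return random.choice([1.0, 2.0, 3.0])  # task targets; mean is 2.0

def adapt(theta, a, lr=0.1, steps=1):
    """Inner loop: a few gradient steps on this task from the shared init."""
    w = theta
    for _ in range(steps):
        w -= lr * 2 * (w - a)
    return w

# Outer loop (starting-point optimization): move theta so the
# post-adaptation loss is low on average over tasks.
theta, meta_lr = 0.0, 0.05
for _ in range(2000):
    a = sample_task()
    w = adapt(theta, a)
    # Analytic meta-gradient of (adapt(theta) - a)^2 w.r.t. theta for this
    # quadratic task: 2 * (1 - 2*lr)^2 * (theta - a), with lr = 0.1.
    meta_grad = 2 * (1 - 2 * 0.1) ** 2 * (theta - a)
    theta -= meta_lr * meta_grad
print(round(theta, 1))  # should land near the task mean, 2.0
```

Note that the meta-gradient differentiates through the inner-loop update, which is exactly what distinguishes MAML-style methods from simply averaging task solutions.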
This can be thought of independently of which optimizer is used in the inner loop.
In any meta-learning approach, the outer-loop optimization will transform the inner-loop optimization process in at least one of three ways, and often in a combination of these three.
December 4, 2024 at 3:21 AM
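The second flavor, loss landscape optimization, can also be sketched as a double loop: the outer loop picks a feature map (a stand-in for an architecture choice), which reshapes the loss surface the inner-loop linear fit has to descend. The data, feature maps, and hyperparameters below are illustrative assumptions.

```python
# Hypothetical data: y = x^2, which a 1-parameter linear model cannot fit
# under identity features but fits exactly under squared features.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [x ** 2 for x in xs]

def inner_loop(feature, lr=0.01, steps=500):
    """Inner loop: gradient descent on the model w * feature(x)."""
    w = 0.0
    for _ in range(steps):
        grad = sum(2 * (w * feature(x) - y) * feature(x)
                   for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return sum((w * feature(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Outer loop: a tiny architecture search over candidate feature maps.
# Changing the feature map changes the inner loss landscape itself.
candidates = {"identity": lambda x: x, "square": lambda x: x ** 2}
best = min(candidates, key=lambda name: inner_loop(candidates[name]))
print(best)  # -> square
```

The inner optimizer is unchanged between candidates; only the landscape it descends differs, which is the sense in which this flavor is independent of the inner optimizer.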