Head of the InClow research group: https://inclow-lm.github.io/
We show that NN simulations can help us unravel these complex processes, next to human experiments & corpus studies
See @yuqing0304.bsky.social’s thread below ⬇️
- child-directed language & syntax learning in LMs (@frap98.bsky.social)
- Turkish benchmark of grammatical minimal pairs (@ezgibasar.bsky.social) & a massively multilingual one, MultiBLiMP (@jumelet.bsky.social)
...and more!
- MT error prediction techniques & their reception by professional translators (@gsarti.com)
- thinking language in Large Reasoning Models (@jiruiqi.bsky.social)
- effect of stereotypes on LLMs' implicit personalization (@veraneplenbroek.bsky.social)
....
See paper for results with 13 LLMs, including mono- and multilingual models of different sizes!
To study LLMs' robustness to these properties, we create experimental paradigms testing syntactic skills w/ different word orders & subordination strategies:
I'm particularly happy to contribute to this for a language I spent years learning and still find fascinating!