👩‍🎓 PhD student in Computational Linguistics @ Heidelberg University | Impressum: https://t1p.de/q93um
☕️ We go over EBMs and then dive into the Energy-Based Transformers paper, which aims to make LLMs that refine their guesses, self-verify, and could adapt compute to problem difficulty (a rough sketch of the core idea is below).
🎥 Watch: youtu.be/GBISWggsQOA
💡 GoFundMe: gofund.me/453ed662
#ACL2025NLP
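For anyone who wants the gist before watching: here is a minimal, hedged sketch of the energy-based refinement loop the video explains. It is my illustration, not the paper's code; `energy_model`, `refine_prediction`, and all hyperparameters are assumed names and values, and the energy model itself (a transformer in the paper) is left abstract.

import torch

def refine_prediction(energy_model, context, y_init, steps=8, lr=0.1):
    """Refine a candidate prediction by gradient descent on its energy.

    `energy_model(context, y)` is assumed to return a scalar energy per
    example, with lower energy meaning a better (context, prediction) pair.
    """
    y = y_init.clone().requires_grad_(True)
    for _ in range(steps):
        energy = energy_model(context, y).sum()            # scalar for autograd
        (grad,) = torch.autograd.grad(energy, y)            # d(energy)/d(prediction)
        y = (y - lr * grad).detach().requires_grad_(True)   # one refinement step
    with torch.no_grad():
        final_energy = energy_model(context, y)             # doubles as a self-verification score
    return y.detach(), final_energy

Spending more refinement steps on harder inputs is where the adaptive-compute angle comes from; the returned energy acts as the model's own check on how good the prediction is.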
I’m always up for a chat about reasoning models, NLE faithfulness, synthetic data generation, or the joys and challenges of explaining AI on YouTube.
If you're around, let’s connect!
aicoffeebreakwl.substack.com
We'll be adding more posts regularly. Stay tuned! 📻
Listen in by joining Roosh Circle's No Papers Club. There will be lots of insights, inspiration, and practical advice. 🫱🫲
Join here: lnkd.in/e_wX2r_G
⌚ Aug 1 at 5:00 PM CEST!