ICLR Conference
@iclr-conf.bsky.social
International Conference on Learning Representations https://iclr.cc/
Runner Up
Neural Machine Translation by Jointly Learning to Align and Translate
Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio
By introducing a form of attention, this architecture became a cornerstone of modern deep learning, providing a foundation for transformers and LLMs.
April 15, 2025 at 3:39 AM
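The mechanism the paper introduced scores each encoder state against the current decoder state, then takes a softmax-weighted sum as the context vector. A minimal NumPy sketch of this additive (Bahdanau-style) attention, with illustrative shapes and names:

```python
import numpy as np

def additive_attention(query, keys, W_q, W_k, v):
    """Bahdanau-style additive attention (sketch).

    score(q, k_i) = v^T tanh(W_q q + W_k k_i)
    query: decoder state, shape (d,)
    keys:  encoder states, shape (n, d)
    """
    scores = np.array([v @ np.tanh(W_q @ query + W_k @ k) for k in keys])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()        # softmax over source positions
    context = weights @ keys        # weighted sum of encoder states
    return context, weights
```

The attention weights double as a soft alignment between source and target positions, which is what "jointly learning to align and translate" refers to.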
Test of Time Winner
Adam: A Method for Stochastic Optimization
Diederik P. Kingma, Jimmy Ba
Adam revolutionized neural network training, enabling significantly faster convergence and more stable training across a wide variety of architectures and tasks.
April 15, 2025 at 3:39 AM
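Adam's update combines momentum (a moving average of gradients) with per-parameter step-size adaptation (a moving average of squared gradients), plus a bias correction for the zero-initialized averages. A NumPy sketch of a single step, with illustrative variable names:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update (sketch). t is the 1-based step count."""
    # Exponential moving averages of the gradient and its elementwise square
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    # Bias correction: the averages start at zero, so rescale early steps
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    # Per-parameter adaptive step
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

The bias correction is why early updates are well-scaled: at t=1 the corrected ratio m_hat / sqrt(v_hat) is close to the sign of the gradient, so the effective step size is roughly lr regardless of gradient magnitude.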