Boosting the Capabilities of Compact Models in Low-Data Contexts with Large Language Models and Retrieval-Augmented Generation
https://arxiv.org/abs/2410.00387
From Priest to Doctor: Domain Adaptation for Low-Resource Neural Machine Translation
https://arxiv.org/abs/2412.00966
🗺️GlossLM: A Massively Multilingual Corpus and Pretrained Model for Interlinear Glossed Text [EMNLP Main]