Yilei Tu
@yileitu.bsky.social
🧑‍💻Multilingual/Multicultural NLP🗺️| MSc @ethzurich.bsky.social🇨🇭; BSc CUHKSZ🇨🇳
Interestingly, we find that simply introducing another non-English language helps📈, even when it does not appear in the demonstrations themselves: prepending context-irrelevant non-English sentences is enough. We show that strategic multilingual exposure bridges the gap for underrepresented languages💡.
March 6, 2025 at 3:31 PM
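A minimal sketch of that prepending strategy, assuming a plain prompt-string setup; the Chinese filler sentence, the Swahili query, and the helper build_prompt below are illustrative assumptions, not the paper's actual data or code.

# Illustrative only: prepend a context-irrelevant non-English sentence
# before the actual query, as the post above describes.
IRRELEVANT_PREFIX = "今天的天气很好。"  # "The weather is nice today." (unrelated to the task)

def build_prompt(question: str, prefix: str = IRRELEVANT_PREFIX) -> str:
    """Return the query with a task-irrelevant non-English sentence prepended."""
    return f"{prefix}\n\n{question}"

# Hypothetical low-resource-language query (Swahili).
print(build_prompt("Jibu swali lifuatalo: 9 jumlisha 2 ni ngapi?"))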
✅ Demos in mixed high-resource languages are more robust and effective than demos in any single high-resource language, and they outperform English-only demos.
✅ This phenomenon generalizes across models (Llama3, GPT-4o) & tasks (math, commonsense, verbal).
March 6, 2025 at 3:31 PM
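A sketch of the mixed high-resource-language demonstration idea above; the prompt format and the (question, answer) pairs are made-up placeholders, not the paper's setup.

# Illustrative only: build an in-context prompt whose demonstrations mix
# several high-resource languages before a low-resource-language query.
DEMOS = [
    ("2 加 3 等于多少？", "5"),        # Chinese demo
    ("4たす6はいくつですか？", "10"),   # Japanese demo
    ("¿Cuánto es 7 más 1?", "8"),      # Spanish demo
]

def build_icl_prompt(query: str) -> str:
    """Concatenate mixed-language Q/A demos, then append the target query."""
    blocks = [f"Q: {q}\nA: {a}" for q, a in DEMOS]
    blocks.append(f"Q: {query}\nA:")
    return "\n\n".join(blocks)

# Hypothetical Swahili arithmetic query.
print(build_icl_prompt("9 jumlisha 2 ni ngapi?"))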
Key Takeaways
✅ In-context demonstrations in non-Latin high-resource languages (e.g., Chinese, Japanese) improve LLM performance across low-resource languages, and do so more effectively than English-only demos.
March 6, 2025 at 3:31 PM
How can we bridge LLMs' multilingual performance gap for low-resource languages through prompting strategies 🌍? Our latest work (w/ @fredashi.bsky.social and Andrew Xue @uwaterloo.ca) systematically analyzes various strategies for Multilingual In-Context Learning.
March 6, 2025 at 3:31 PM