Maharshi Gor
@maharshigor.bsky.social
PhD student @ Univ of Maryland
NLP, Question Answering, Human AI, LLMs
More at mgor.info
🏆ADVSCORE won an Outstanding Paper Award at #NAACL2025

🚨 Don't miss out on our poster presentation *today at 2 pm* by Yoo Yeon (first author).

📍Poster Session 5 - HC: Human-centered NLP

💼 Highly recommend talking to her if you are hiring and/or interested in human-focused AI development and evals!
May 1, 2025 at 12:38 PM
Reposted by Maharshi Gor
🚨 New Position Paper 🚨

Multiple choice evals for LLMs are simple and popular, but we know they are awful 😬

We complain they're full of errors, saturated, and test nothing meaningful, so why do we still use them? 🫠

Here's why MCQA evals are broken, and how to fix them 🧵
February 24, 2025 at 9:04 PM
Reposted by Maharshi Gor
The Impact of Explanations on Fairness in Human-AI Decision-Making: Protected vs Proxy Features

Despite hopes that explanations improve fairness, we see that when biases are hidden behind proxy features, explanations may not help.

Navita Goyal, Connor Baumler, et al., IUI'24
hal3.name/docs/daume23...
December 9, 2024 at 11:41 AM
Reposted by Maharshi Gor
Do great minds think alike? Investigating Human-AI Complementarity in QA

We use item response theory to compare the capabilities of 155 people vs. 70 chatbots at answering questions, teasing apart their complementarities, with implications for design.

by Maharshi Gor et al., EMNLP'24
hal3.name/docs/daume24...
December 12, 2024 at 10:41 AM
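The item-response-theory idea above can be illustrated with a minimal sketch. This is not the paper's actual model; it is a generic two-parameter logistic (2PL) IRT model, and all names and numbers (`human_theta`, `bot_theta`, the item parameters) are hypothetical, made up purely for illustration.

```python
import numpy as np

def p_correct(theta, a, b):
    # 2PL IRT: P(correct) = sigmoid(a * (theta - b))
    # theta: subject ability, a: item discrimination, b: item difficulty
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Toy 1-D example with invented parameters: two "subjects" (a human and a
# chatbot) and two items of different difficulty. In a real IRT analysis,
# theta, a, and b would be estimated jointly from response data.
human_theta, bot_theta = 0.5, 1.2
easy_item = dict(a=1.5, b=0.0)
hard_item = dict(a=1.5, b=2.0)

print(p_correct(human_theta, **easy_item))  # ~0.68
print(p_correct(bot_theta, **easy_item))    # ~0.86
print(p_correct(human_theta, **hard_item))  # ~0.10
```

Fitting subjects and items in a shared latent space is what lets the paper compare humans and chatbots on the same scale and spot items where one group outperforms the other.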
Reposted by Maharshi Gor
💯

Hallucination is totally the wrong word, implying it is perceiving the world incorrectly.

But it's generating false, plausible sounding statements. Confabulation is literally the perfect word.

So, let's all please start referring to any junk that an LLM makes up as "confabulations".
petition to change the word describing ChatGPT's mistakes from 'hallucinations' to 'confabulations'

A hallucination is a false subjective sensory experience. ChatGPT doesn't have experiences!

It's just making up plausible-sounding bs, covering knowledge gaps. That's confabulation
December 11, 2024 at 2:47 PM
Reposted by Maharshi Gor
starter pack for the Computational Linguistics and Information Processing group at the University of Maryland - get all your NLP and data science here!

go.bsky.app/V9qWjEi
December 10, 2024 at 5:14 PM