human behavior/reinforcement learning/decision-making/computational modeling
As AI enters legal, medical, and personal decision-making, we need to understand how people perceive its moral reasoning.
Terms tied to cost-benefit logic (“save,” “lives”) often triggered disagreement, especially in personal moral dilemmas.
So: detection relied on surface cues, but judgment aligned with meaning.
But those surface cues had little to no effect on agreement.
People spotted the machine from the style, not the substance.
This reduced detection—but not the belief-based bias.
People still agreed with content they believed was human, even when it wasn’t.
In complex moral dilemmas (personal moral), participants preferred AI-generated justifications—but only when they didn’t know they came from AI.
When they thought a justification was from AI, they agreed less.
So: pro-AI content, anti-AI belief.
People could spot AI-generated moral justifications better than chance—especially in morally difficult scenarios.
Still, accuracy stayed below 70%, and many AI responses passed as human.
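As a purely illustrative note on what "better than chance" means here: below is a minimal sketch of how a detection rate in this range could be tested against the 50% chance level with an exact binomial test. The counts (200 judgments, 130 correct) are hypothetical placeholders, not figures from the study.

```python
from scipy.stats import binomtest

# Hypothetical counts only (not figures from the study): suppose participants
# made 200 human-vs-AI judgments and got 130 right (65% accuracy, above
# chance yet below the ~70% ceiling described above).
n_judgments = 200
n_correct = 130

# Exact one-sided binomial test against the 50% chance level.
result = binomtest(n_correct, n_judgments, p=0.5, alternative="greater")

print(f"observed accuracy: {n_correct / n_judgments:.0%}")
print(f"p-value vs. 50% chance: {result.pvalue:.4g}")
```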