Oof, the sycophancy problem in LLMs, plus their tendency to latch onto any irrelevant detail you feed them, recently led a P2 problem call down the wrong path for hours.
The chatbot is never going to TELL you to step back and ask whether the entire line of inquiry is irrelevant to the larger goal.
This is your moat. It's mine.
August 29, 2025 at 2:59 AM