AI chatbots that always agree can feel supportive, but they risk reinforcing self-doubt, overthinking, and emotional dependence, especially in vulnerable people. They often miss warning signs such as crisis states or delusional thinking. Safer designs need more grounding, challenge, memory, and human oversight. Use AI as a companion, not a counselor.