Young People Turning to ChatGPT for Emotional Support: A Growing Trend That Worries Experts

ChatGPT, the artificial intelligence chatbot widely used for professional and educational assistance, is increasingly serving young people as a source of emotional support — a trend experts describe as both understandable and alarming.

Sara, a 27-year-old professional who requested her name be changed, initially used ChatGPT for work-related help like fact-checking and idea generation. But over time, it became a source of emotional reassurance. “I began using it during emotionally tough situations at work, in the family, and even in relationships,” she said. “It analyses everything like it reads my mind. I love the reassurance and the validation it gives me.”

What started as a productivity tool slowly became a 24/7 confidant for Sara, especially during moments of anxiety or panic. “It became like a coach, helping me understand myself better,” she said. “Therapy is something I’d love to try one day, but it’s expensive. ChatGPT is private, discreet, and always there when I need it.”

Mental health professionals say this shift is no surprise. Dr Alexandre Machado, a clinical neuropsychologist at Hakkini mental health clinic in Dubai, acknowledged the appeal. “It’s easy, anonymous, and always available — like having a friend in your pocket,” he said.

But the growing dependence on AI for emotional support is raising serious concerns. Dr Waleed Alomar, specialist psychiatrist at Medcare Royal Speciality Hospital in Al Qusais, warned that AI tools lack the ability to distinguish between everyday emotional struggles and serious mental health conditions. “Many users — especially young people — may begin to feel like they’re speaking to a real person or even a licensed professional,” he said.

“This can be dangerous because AI is not equipped to diagnose or guide users to professional help when it’s needed. While it might offer short-term relief, it could also delay necessary mental health treatment,” he added.

Experts cited real-world cases that highlight the risks. Dr Machado pointed to instances where individuals, including a man in Belgium and a teenager in the UK, were reportedly influenced by AI platforms in harmful ways. “These are extreme cases, but they show the potential dangers of unregulated reliance on AI for emotional guidance,” he said.

Still, both experts agree that AI can play a supportive role. “For late-night anxiety or when someone needs an anonymous space to vent, AI tools offer an accessible and non-judgmental option,” said Dr Alomar. “But they should not replace human care.”

The key, they say, is balance. “AI can be a valuable supplement, but not a substitute,” said Dr Machado. “Use it wisely — and know when it’s time to seek help from a real person.”