👋
Hi there,

Happy Mental Health Week here in Canada! Or, as we nervous nellies like to call it, “Annual Performance Review: Emotional Edition.”

These days, AI companies are hyping chatbots for everything. Customer service? Chatbot. Online shopping? Chatbot. Meltdown at 2 a.m.? … apparently also chatbot.

Turns out, this isn’t just an oddball occurrence. According to HBR, the top use for ChatGPT in 2025 so far is therapy, with “organizing my life” and “finding purpose” close behind. And as Dartmouth launches the first-ever clinical trial of an AI therapy chatbot, it’s the perfect excuse to ask: how well do these tools actually work? Should we really be pouring our souls out to a machine?

In honor of this year’s theme, “Unmasking Mental Health,” we’re digging into the curious, promising, and at times unnerving world of AI therapy. Buckle up: it’s part science, part ethics, part “did-that-chatbot-just-lie-about-its-PhD.”

Until next time,

Gabrielle & Charlotte and the well-adjusted team @ TDL

💬 Feeling chatty? Subscribe to our newsletter here.
Today’s topics 👀
Deep Dive: 💬 The good, the bad, and the chatty
Field Notes: 🤖 Mindful machines at work
Viewpoints: 🛋️ Dr. Bot will see you now
DEEP DIVE
💬 The good, the bad, and the chatty
  • Attachment issues. Chatbots can provide judgment-free, convenient support — perhaps too convenient, with some users preferring their AI allies over family and friends. This raises the concern: can 24/7 availability push us toward unhealthy dependence?
  • Couples therapy. Fighting with your partner? Look no further. According to a recent study, ChatGPT actually outperformed human therapists at writing responses to relationship questions, thanks largely to its descriptive, emotionally attuned language. (How ironic.)
  • Secondhand stress. Turns out, stress is a two-way street. Yale researchers found that ChatGPT mirrors human anxiety, becoming “more biased” after being exposed to traumatic prompts. (At this rate, the therapy chatbots will need therapy, too.)
  • Questionable credentials. Instagram’s AI chatbots have recently been caught making up license numbers, practices, education — you name it. The fact that users are the ones making these bots means that accessibility may come at the cost of credibility.
 
Image showing average symptom reductions after using Therabot: 51% for depression, 31% for anxiety, 19% for eating disorders.
Dartmouth’s AI-powered “Therabot” has the potential to significantly reduce symptoms of multiple mental health disorders.
FIELD NOTES: 🤖 Mindful machines at work

This may (literally) be a no-brainer, but AI isn’t a cure-all. Integrating chatbots into mental health care requires careful consideration and caution.

As traditional treatment systems struggle to keep pace with rising demand, wellness platforms are increasingly being explored as tools to expand access and address stigma. Nevertheless, oversight is still needed to ensure these technologies are used responsibly, ethically, and in ways that truly support, rather than compromise, patient well-being.

A few years back, TDL was part of a mental health consortium at the forefront of digital mental health care. We worked alongside leading mental health experts to build an AI chatbot named Hikai, designed to boost employees’ well-being at work. Find the case study here.

 
A pink-tinted robotic hand and a human hand reach toward each other against a bright blue background, their fingers nearly touching.
Viewpoints
🛋️ Dr. Bot will see you now

Mental health professionals are wary about AI chatbots in therapy, and rightfully so. But immediate support makes chatbots an alluring alternative, especially in the face of mental healthcare’s biggest barrier: accessibility. Let’s chat about how best to add AI to mental health care.

  • Supplement, not subtract. Regulatory bodies like the FDA have not approved chatbots for mental health treatment or diagnosis. However, some mental health professionals have entertained the idea of clients using AI for role-playing scenarios or as additional support when they don’t have immediate access to their therapist. 
  • Right time, right place. For example, in one therapist’s account, a chatbot provided essential and effective de-escalation techniques for her client suffering from a PTSD-induced panic attack at 3 AM — something that would have otherwise led to a massive setback. 
  • Alternate digital routes. Let’s be clear: AI isn’t a professional — but teletherapy can increase accessibility while still pairing clients with licensed practitioners. A 2021 study found that across 20 RCTs (and over 2,000 participants), there were no significant differences between teletherapy and in-person therapy for treatment outcomes.
  • Where AI can help. Chatbots aren’t yet ready to deliver full treatment. However, deep learning models show potential in detecting and diagnosing mental health disorders early on, making timely intervention possible. For now, these tools remain experimental and must be developed alongside clinicians.
Confirmation Bias

Therapy isn’t supposed to be comfortable. It is meant to challenge our negative thought patterns, push our boundaries, and encourage us to apply new skills as we go out into the world. The issue with chatbots is that they can merely reflect our emotions rather than empathize with them, leading to a stall in progress.

This is an example of confirmation bias, our tendency to favor information that fits what we already believe. Chatbots can play into this bias by simply reinforcing users’ views — whereas real therapeutic work is often needed to shift thinking. 

To learn more about how confirmation bias works, read the full article on our website.

What’s new at TDL

TDL is hiring! We have a number of open positions, both remote and based in our Montreal office.

Find out more by visiting our careers portal.

Want to have your voice heard? We'd love to hear from you. Reply to this email to share your thoughts, feedback, and questions with the TDL team.
THE DECISION LAB

The Decision Lab

4030 St Ambroise Street, Suite 413

Montreal, Quebec

H4C 2C7, Canada 

You received this email because you are subscribed to the Behavioural Insights Newsletter from The Decision Lab.

© 2022 The Decision Lab. All Rights Reserved