According to a study published on February 12, 2025, in the journal PLOS Mental Health, H. Dorian Hatch and his team at Ohio State University found that psychotherapy responses generated by ChatGPT were often rated more highly than those written by human therapists. The study has sparked widespread interest in whether machines can serve as psychotherapists, especially given the growing capabilities of generative artificial intelligence.



In the study, the research team presented more than 800 participants with 18 simulated couples-therapy scenarios. While participants noticed differences in language patterns, they were largely unable to tell whether a given response had been written by ChatGPT or by a human therapist. This aligns with Alan Turing's prediction that humans would struggle to distinguish machine-written responses from human-written ones. Strikingly, ChatGPT's responses generally received higher ratings on the core guiding principles of psychotherapy.

Further analysis revealed that ChatGPT's responses were typically longer than the therapists'. Even after controlling for response length, ChatGPT's replies still used more nouns and adjectives. Because nouns typically describe people, places, and things, and adjectives supply contextual detail, this suggests that ChatGPT may be giving patients more context. That broader contextualization may explain why participants rated ChatGPT's responses higher on the common factors of psychotherapy.

The researchers believe these results indicate that ChatGPT has the potential to enhance the psychotherapy process, and that future research could lead to new psychotherapy interventions. Given the growing presence of generative artificial intelligence in therapeutic settings, the authors urge mental health professionals to improve their technological literacy so that AI models are trained and supervised by responsible professionals, improving both the quality and accessibility of mental health services.

The research team concluded: "Since the advent of ELIZA nearly 60 years ago, researchers have been discussing whether AI can act as a psychotherapist. While many important questions remain unanswered, our findings suggest the answer may be 'yes.' We hope this study prompts the public and mental health practitioners to consider the ethics, feasibility, and practicality of integrating AI into mental health treatment."

Key Points:

🌟 ChatGPT's psychotherapy responses were often rated higher than those of professional therapists.

🧠 Participants could hardly distinguish between machine-written and human-written therapeutic responses.

📈 The study suggests that AI may play a positive role in psychotherapy, highlighting the need for improved technological literacy among mental health professionals.