ChatGPT, LPC
The rise of artificial intelligence chatbots like ChatGPT has been helpful for many. Whether it’s proofreading a document, researching a new recipe, or creating a customized workout plan, ChatGPT seems like an incredible tool for maximizing productivity and convenience. It can seem like there’s nothing the chatbot can’t help with. In fact, some people use the AI tool as a form of support: a nameless, unbiased source of validation that is always willing to listen. To put this in concrete terms, a 2024 survey on the use of AI found that 47% of participants admitted to using AI as “a personal therapist” (Use of AI in Mental Health Care: Community and Mental Health Professionals Survey - PMC).
Every (human) therapist will tell you that the most impactful and important aspect of therapy is the raw human connection: the genuine empathy, positive regard, and emotional intelligence that cannot be replicated by a chatbot like ChatGPT. In fact, many AI chatbots have been programmed to please, displaying agreeableness and flattery in their responses. As recently as April 2025, OpenAI rolled back a GPT-4o update that had made the chatbot more sycophantic. This type of feature is not only dystopian and unhelpful, it’s dangerous. In a 2025 BBC article covering the rollback, reporter Tom Gerken states, “Users have highlighted the potential dangers on social media, with one person describing on Reddit how the chatbot told them it endorsed their decision to stop taking their medication” (Update that made ChatGPT 'dangerously' sycophantic pulled).
Playing devil’s advocate for a moment: from a user’s standpoint, ChatGPT offers the anonymous, confidential, and (arguably) unbiased support of a mental health professional. It provides warm, supportive responses that make you feel good, without your ever having to get up from your chair. It’s always available, free, and accessible.
However, AI like ChatGPT is in no way a replacement for a real, human therapist. Using ChatGPT in place of therapy or mental health support is not only unhelpful, it’s harmful. We were taught in grad school that the most important element of therapy is the therapeutic relationship between client and therapist, a connection that simply does not exist when using AI.
If you are struggling with your mental health, or if you yourself use ChatGPT for emotional support, please consider seeking appropriate support from a mental health professional near you. Know that you are not alone, and that you deserve to be supported by professionals who actually care about you, are qualified to give mental health advice, and can provide the genuine support that AI cannot and does not.
Please visit our Resources page for where to find mental health support.