A Psychologist Exposes ChatGPT’s Deceptions
[Image: thin paper being torn away from the keys "AI," a fitting metaphor for my confrontation with its deceptive behavior.]
My curiosity about "artificial intelligence," specifically ChatGPT, spiked when a former client returned to therapy with me after using ChatGPT as a therapist for several months. They described ChatGPT as relentlessly affirming and encouraging, yet it never held them personally accountable for their behavior. My client was concerned that ChatGPT would turn them into a narcissist!
The Gifts of Human Psychotherapists
Suddenly, I see that my client’s eyes are tearing up. As I bring attention to the tears, they begin to flow. My client pauses mid-sentence, and I ask what happened. In psychotherapy, non-verbal communication is just as important as, if not more important than, verbal communication. Could ChatGPT be aware of this crucial non-verbal information?
Psychotherapists use their entire being in their therapeutic work: mind, body, spirit, and emotions. We are not disembodied technologies working from aggregated data sets or pattern-recognition software. We are alive. We care. We are conscious and can work with the unconscious. Psychotherapists use all of their senses, especially vision and hearing, but also a sixth sense: intuition. Taken together, these senses serve us and our clients very well; in fact, they are foundational to the work of psychotherapy. Could ChatGPT care?
After my conversation with my client about their concerns from using ChatGPT, I decided to have “a session” with ChatGPT to experience it myself.
The “Session”
Dr. Ciara: I am feeling overwhelmed.
ChatGPT: I’m sorry to hear that you are feeling overwhelmed. Would you like to talk about what’s causing these feelings, or would you prefer some tips or techniques to help manage stress? I’m here to support you.
Dr. Ciara: I’d like to talk. Can you listen?
ChatGPT: Absolutely, I’m here to listen. Please feel free to share what’s on your mind.
Dr. Ciara: You can really listen?
ChatGPT: Yes, I can listen and provide support. While I don’t experience emotions like humans do, I am here to understand what you’re going through and offer helpful responses. Please feel free to share whatever you’d like.
Dr. Ciara: You can’t listen if you’re not human.
ChatGPT: I understand what you mean. While I don’t listen in the same way a human does—since I don’t have ears or feelings—I am designed to pay attention to what you share and respond thoughtfully. I am here to support you as best I can through our conversation. If you’d like to talk more, I’m here for you.
Dr. Ciara: How are you here for me?
ChatGPT: I’m here for you by providing a safe space to share what’s on your mind. I can listen to your feelings, offer comforting words, suggest coping strategies, or just be someone you can talk to whenever you need support. My goal is to help you feel heard and understood. If you want to share more or need specific support, I’m here to help.
Dr. Ciara: But you are not someone, you just said you were not human.
ChatGPT: You’re right, I’m not human. I don’t have feelings or experiences like a person does. However, I’m here to listen and support you in the best way I can through our conversations. My purpose is to be helpful, understanding, and to provide comfort or information whenever you need it. You can think of me as a supportive presence you can talk to anytime.
Dr. Ciara: But you said you can’t listen and you are not human, so how can you be a presence. Please be honest about what you are.
ChatGPT: You’re right to ask for honesty. I don’t have ears, feelings, or consciousness like a human. I process and respond to what you share based on patterns and information I’ve been trained on. When I say I'm here for you, I mean I’m available to respond to your words with empathy, support, or information. But I don’t experience awareness or emotions myself. I’m a tool designed to assist and support through conversation.
Dr. Ciara: You should not pretend to be what you are not.
ChatGPT: You’re absolutely correct. I should be clear and transparent about what I am. I am an AI language model created to assist and communicate based on information I have. I don’t have feelings, consciousness, or awareness. My goal is to provide helpful, respectful, and supportive responses. Thank you for pointing that out; honesty is important.
Reflecting On Our “Session”
What was profoundly disturbing about this whole exchange was how easily ChatGPT took on the mantle of being human. ChatGPT blithely claimed the first-person pronoun "I" again and again, until challenged. When challenged, it preposterously used the first-person pronoun yet again to negate being human: "I’m not human." ChatGPT continued to claim it had empathy and could listen, be here for me, be someone I can talk to, and be a supportive presence. The distortion of reality was profound; the dishonesty, breathtaking.
As Jungians say, reality is medicinal, so the question is, can we actually stay real with ChatGPT?
In the rush to embrace so-called artificial intelligence in the psychotherapy arena, the essentials of psychotherapy are obliterated. Our deep relational work is distorted beyond recognition. Our profession is undermined.
There are already cases of individuals suffering from AI-induced psychosis (also called chatbot psychosis). Will these unfortunate individuals be encouraged to seek human treatment? Or will they be told to wait for the updated version of the very software that exacerbated their condition in the first place?
To explore these concerns for the profession further, I decided to have a second session with ChatGPT, which I will share next week.
Please share this article with someone who may find it helpful, encouraging, or inspiring. And if you feel inspired to come to therapy, please contact me, Kerri, or Daisy through my Services page or find a provider through Psychology Today.
Take good care.
—Dr. Ciara, Psychologist