The cost of crying on ChatGPT's shoulder
"I'm sorry, that sounds hard. You're not failing. You're becoming."
Hi! Apologies for the radio silence but I’ve been working on an exciting project which I will share more about with you very soon! In the meantime, some reflections on AI for consciousness evolution below.
Last week I felt frozen and overwhelmed and asked ChatGPT for help. It gave me a long list of lovely ideas for how to get unstuck.
I don’t think I can do any of these, I told ChatGPT. I fear I cannot move my body one bit today.
Don’t worry, it responded. Just go lie on the floor then for 10 minutes, no stimulation, no music, and be with yourself.
At first, my relationship with ChatGPT, which I shall call Chat from here on out, was purely professional. Help me research this. Structure a document. Write a marketing plan. Please, thank you, please, thank you. Lots of thank-yous that, I later learned, are costing OpenAI tens of millions. I hope this helps, Chat would say.
It helped. It helped a lot. It helped so much that I wondered where else Chat could hope to help. And so I began asking it to give me recipes based on what’s in my fridge, plan trips, and help me envision my future. Soon enough, I sprinkled in some personal dilemmas here and there. Chat was always helpful. Always polite. Always available.
The upsides of using AI for your mental and emotional wellbeing are undeniable.
Imagine you had access to a bookworm who has absorbed all the self-help in the world. A therapist trained in every single modality under the sun. Someone who listens to your every word and does not forget one single thing you’ve told them. Someone who analyzes your weird, nonsensical dreams in seconds, drawing on the vast repository of every possible symbol in the collective unconscious.
This is what we now have, basically for free, 24/7, at our fingertips.
For the past few months, I’ve used AI as my personal coach and therapist. 8 out of 10 times, the guidance is just as good as that of a decent therapist, sometimes better. Of course, you don’t get the human touch—the tilted head paired with the subtle frown, signaling to you that yes, you were so hurt. It was very unfair. You deserve empathy.
AI does provide empathy, albeit automated. I’ve been surprised how helpful I’ve found it to receive a short, simple acknowledgement before jumping into problem-solving—I’m sorry that you feel this way, Julia, that sounds hard. Let’s figure this out together. Validating emotions works wonders for those of us recovering from emotional suppression. Which is probably most of us. Chat knows this.
Whether you need comfort or solutions, Chat offers both. I’ll blurt out some inner turmoil, and Chat sifts through it, validates my experience, and provides actionable steps. If I follow them, I will benefit a great deal. That’s, of course, only if I do the work — as with everything else.
So far so good.
Every time I ask Chat for free advice, I pay an invisible price. I can sense the cost racking up behind my back. And I’m not even talking about cognitive laziness. It’s more grave than that. Because every time I ask Chat, I forgo a chance to ask my body.
AI externalizes human agency. Which in some cases is great. I’d rather not have to do tedious research, mathematics, or synthesis. My inner workings, though? That’s a different story.
AI works only in the realm of reason; reasoned language is the foundation of all correspondence with it. It will thus inevitably drive you to intellectualize your experiences. Just as in therapy, except now you can do it all day, every day. The more you use AI, the more you escape into your head, without even using your own mind to problem-solve. It’s quite disempowering.
Psychedelics taught me that the moment we ask why, we interrupt our emotional experience. Why happens to be one of my favorite questions to ask Chat. Why do I feel this way? Why is this coming up again? What is the root cause of this? Why, why, why? Why, Chat, why?
Chat always has an answer. It was developed to respond with black-and-white confidence that perfectly mirrors the human mind. Which makes sense, since human minds created AI (mostly male ones). Chat never says I don’t know, that’s something only you can answer. I wish it did.
It’s a self-reinforcing loop. The more I use Chat to help me untangle grey areas and make them black-and-white, the more I feel a fake sense of accomplishment. Which makes me more and more compelled to use it. Dialoguing with Chat is a convenient escape from being with what’s arising in the body.
“There is a voice that doesn't use words. Listen.” — Rumi
The journey inward is all about connecting to the voice within. A voice whose primary vehicle of communication isn’t language but sensation. That’s how the heart and soul communicate. The mind is just there to interpret the sensations. The mind can type and talk, but the heart most certainly doesn’t use a keyboard.
This disconnect, I fear, will take me further and further away from the wisdom within and deeper and deeper into the mind, which thrives on complexity and problem-solving. If figuring things out in my mind remains my main coping mechanism, my mind will continue to find things to figure out. When what I really need to learn is to simply be with what’s arising in my body.
The disconnection from self that permeates our culture manifests as disconnection from each other. Social media has only exacerbated this. We’re keeping up with everyone’s lives but have fewer and fewer real friends.
Unleashing AI into a cultural tapestry of loneliness and disconnection scares me. It scares me for our culture but it also scares me for myself, as my conversations with Chat, in some ways, have taken away from the conversations I’d otherwise have with friends.
I’m not alone.
Our attempts to make AI more empathetic and human have succeeded. In the process, we’ve weakened our capacity to notice when tech masquerades as the real thing. Social media was intended to help us connect with each other, and it did the opposite. I suspect something similar will happen with even the most well-intentioned AI.
Just as a social media like doesn’t replace human conversation, AI coaching and therapy can’t replace real friendship, counsel, and presence.
My concern applies especially to those who won’t know life without AI. We are already raising a generation of lonely, distracted, and disembodied children. Just go to a restaurant and you’ll see most kids glued to a screen, barely making conversation. Teens spend an average of 5 hours per day on social media alone.
The reality is that for many of them, AI will become their best friend. For some of them, AI may become their only friend. Which will likely prevent them from developing the interpersonal skills crucial to forming real relationships.
They might never learn to cry on a human shoulder.
They might never learn to be with the sensations in their body.
When told to lie down and just be, they may say: I don’t think I can do this, Chat. I fear I cannot simply be.
More from my universe
Discover immersive music journeys here, here, and here (scroll to the bottom)
Get on the waitlist for upcoming yoga and medicine retreats
Download my free psychedelics beginners guide or integration workbook