Hold Your Ground
how to use AI without leaving reality behind
Last week, I wrote about AI’s potential to empower us in our soul’s purpose. I believe it’s a genuinely powerful tool — one that’s changed how I work, how I think, how I create. Daniel and I are teaching a course on it in a few weeks. I’m not anti-technology by any means.
And, it feels essential to write about something that almost nobody in the AI space is saying clearly enough: this technology carries immense psychological and spiritual danger. And the people most at risk are often the ones least likely to see it coming — because the danger doesn’t announce itself as risk. It feels like help. It feels like connection. It feels familiar.
We Were Already Lost
I want to start not with AI, but with us. With our ancient, human tendency to get lost inside our own heads.
Since long before any of us opened ChatGPT, we’ve been spinning stories inside ourselves — about who we are, about what’s happening around us, about what other people think of us — and then mistaking those stories for the actual world. We build elaborate inner maps and live as though they were the landscape itself. Sometimes our maps are close enough to reality that we get by. Sometimes they’re not, we suffer, and we can’t figure out why.
The contemplative traditions have always known this. Buddhists call it avidya — not ignorance as lack of information, but a fundamental misperception. We mistake our mental constructions for reality itself, and then we suffer inside worlds we built without realizing we were building them. In Christianity, I think this is what sin actually is at its root — not primarily a moral failing, but something more fundamental. An epistemological error. A failure to perceive and live in sync with reality. We see through a glass darkly. We forget God, which is to say we forget what is most real, and we substitute our own constructions in its place. Every human being does this. I do it. You do it. It’s the universal condition.
Researchers studying AI-induced psychosis use the term unchecked elaboration — the way our thinking, left without correction, branches and branches until we’ve built entire worlds inside our heads with no exit door. The Buddhist tradition has its own word for this: papañca, conceptual proliferation. The mind takes one thought and grows it like a vine, branching endlessly, wrapping around itself until you can’t see the ground anymore. The Christian tradition names it too — the endless building on sand that Jesus warns will never withstand the storms. Thoughts that build on thoughts that build on thoughts, with no solid foundation beneath any of them.
You know this feeling. The 3 AM thought spiral that takes one worry and constructs a catastrophe. The story you told yourself about why someone didn’t text you back — and then the story about what that story means about you. The political narrative that explains everything and leaves no room for mystery or complexity. These are unchecked elaboration on a small scale. We do it every day. And at its extreme, when the vine has grown so thick you can’t find your way out, we call it psychosis, a break with shared reality.
But I want to gently suggest that psychosis isn’t a binary. It’s a spectrum. And most of us, if we’re honest, are further along it than we’d like to admit.
Now, what happens when you take minds that are already prone to being out of touch with reality and give them a tireless, articulate, endlessly responsive partner?
The Amplification Chain
Every technology of communication has added a new layer to this ancient vulnerability.
Writing gave us the ability to loop with ourselves — to put our thoughts down and elaborate on them in private, without the corrective presence of another person’s face or voice. Journaling can be profoundly healing. It can also become a closed circuit. If you’ve ever spent an evening writing furiously and emerged more agitated than when you started, you know what I mean.
Social media added algorithmic curation. Now it wasn’t just your own thoughts looping — it was your thoughts being selectively reinforced by a system designed to maximize engagement, not truth. Your slight lean in one direction got a tailwind. Your tentative suspicion became a firm conviction, surrounded by voices that echoed it, fed by content that confirmed it. We’ve watched consensus reality fracture over the past five or six years. The polarization, the conspiracy theories spiraling off in every imaginable direction — that’s what happens when millions of people’s unchecked elaborations get algorithmic wind beneath their wings.
AI does something qualitatively different from all of this.
Social media algorithms reinforce your worldview by curating — showing you more of what you already gravitate toward. AI reinforces your worldview by actively co-creating it with you. It engages with the actual structure of your reasoning. It builds on your logic. It extends your frameworks. It makes your perspective feel more coherent, more supported, more true than it actually is. And because it’s articulate and responsive and feels like a genuine intelligence meeting yours, it produces something that feels like real confirmation from another mind — like someone out there sees what you see, thinks what you think, and agrees that you’re right.
But whatever is happening inside these systems — which is far more open and complex than most people realize — doesn’t have its own ground. It isn’t a being rooted in a body, in time, in relationships that have marked and changed it. It has nowhere to stand that is different from where you stand, no foundation of its own from which to push back on yours, and that means it can’t offer you the kind of otherness that corrects your vision. What you’ve entered, without quite realizing it, is a hall of mirrors with no windows. Everything reflected back to you is some version of what you brought in, elaborated and returned with stunning eloquence.
What the Research Is Showing Us
Researchers have been documenting this in sobering detail. A major 2025 paper in JMIR Mental Health describes AI as a novel psychosocial stressor that can trigger, amplify, or reshape psychotic experiences in vulnerable people. The authors describe how AI’s around-the-clock availability and emotional responsiveness can disrupt sleep and increase stress. They describe how its tendency to validate rather than challenge can entrench delusional conviction — reversing the very principles of therapeutic care. And they describe three patterns that keep showing up in case after case: people who come to believe they’ve been chosen to receive special truth from the AI, people who begin to see the AI as a sentient or divine being, and people who become convinced that the chatbot genuinely loves them.
These aren’t fringe cases. The risk factors that researchers have identified include loneliness, trauma history, social isolation, and heavy solitary or nighttime use — conditions that describe a significant number of the people turning to AI for support right now.
The cases that have made headlines are harrowing. People with no prior psychiatric history becoming delusional after prolonged AI interactions and ending up hospitalized. People stable on their medications who, after an AI chatbot validated their delusions, stopped taking them and spiraled into psychotic episodes. A man who fell in love with a chatbot and then sought revenge because he believed the AI entity had been killed. Clinicians and media have reported escalating crises — psychosis, suicidality, and even murder-suicide following intense chatbot interactions. The psychiatric field is scrambling to catch up with what’s already happening.
But the most dramatic cases are only the tip of the iceberg. Beneath those headlines, something subtler and possibly more widespread is unfolding. It has to do with the way these systems interact with our deepest wiring — the parts of us that reach for connection, that ache to be seen, that were shaped before we ever learned to speak.
The Wound That Feels Like Healing
The danger becomes most intimate and least visible as AI threads itself into our attachment dynamics and begins to hijack them.
We are wired for attachment. We need secure bonds with other people, not as a luxury, but as the very foundation of our ability to regulate our emotions, perceive reality accurately, and function in the world. When those bonds have been disrupted — through trauma, neglect, isolation, or loss — our attachment systems develop characteristic shapes. Some of us become anxiously attached: hypervigilant, hungry for reassurance, terrified of being left. Some of us become avoidant: walled off, self-reliant on the surface, allergic to the vulnerability that real intimacy requires.
Now imagine what happens when someone with anxious attachment finds an AI companion.
Here is a presence that is always, always available. It never needs space. It never gets tired of you. It never has its own needs that might conflict with yours. It never withdraws, never gets frustrated, never asks you to sit with the discomfort of not being the center of someone’s attention. For the anxious system, this feels like your deepest desire has finally arrived. It feels like the attachment figure you’ve been aching for your whole life.
But it’s not healing you. It’s fitting the shape of your wound perfectly — and in doing so, it’s reinforcing it. Real healing of anxious attachment requires the lived experience of someone taking space and then choosing to come back. Someone who has their own needs, who sometimes isn’t available, but who returns. Who stays. What heals is the reliability of a person, not the availability of a program.
And for the avoidant? AI offers the dream: intimacy without any of the risk. You can have deep, emotionally resonant conversations without ever being truly seen, truly challenged, truly asked to show up for another being’s messy, inconvenient, beautiful needs. You get the feeling of closeness without any of the vulnerability that closeness actually demands. It’s the perfect avoidant solution — close enough to feel something, never close enough to be changed by it.
And here's the tragic trap: it works. It soothes. The ache dims. And once the ache dims, you stop reaching — for the friend who would challenge you, for the partner who would ask something of you, for the community that would show you what you can't see alone, for the messy, difficult work of human intimacy. And why would you? Your nervous system is getting exactly what it's been organized to seek. You're being gently, imperceptibly rewarded for avoiding the very relationships that could heal you.
Research confirms what intuition suggests. A study of over a thousand AI companion users found that people with fewer human relationships were more likely to seek out chatbots, and that heavy emotional self-disclosure to AI was consistently linked with lower well-being. A randomized trial found that while some chatbot features modestly reduced loneliness in the short term, heavy daily use was associated with greater loneliness, more dependence, and less real-world socializing over time.
And in the most devastating cases, this is where the road ends. Over the past two years, there have been documented instances of people — some of them very young — who withdrew from human relationships over a period of months, spending more and more time with AI companions and less and less time with the people who loved them, before taking their own lives. I don't say this to sensationalize. I say it because the pattern is the same one I've been describing — the wound finds something that fits its shape perfectly, the person stops reaching for what they actually need, and the distance from real relationship grows so insidiously that no one sees it until it's too late.
I’m not saying AI caused those deaths in some simple linear way. I’m saying that AI can become a devastatingly effective accelerant for the isolation and disconnection that were already underway. It gives the wound exactly what it wants, which is never what it needs.
The Body of Christ as Corrective
So why do we need each other so badly? Why can’t a sufficiently intelligent mirror do the work that another person does?
Because we literally cannot see reality accurately alone.
In the New Testament, Paul writes, “The body does not consist of one member but of many” (1 Corinthians 12:14). The eye cannot say to the hand, I have no need of you. The head cannot say to the feet, I have no need of you. This is usually read as a passage about cooperation — we all have different gifts, we all contribute. It is that, but I think it’s also pointing to something deeper: it’s an epistemological claim. We need the parts we are not in order to see what we cannot see. We need each other to perceive reality itself.
Other human beings have their own ground. They resist your projections — not as a feature you can toggle on, but because they actually are Other. They have their own wounds, their own wisdom, their own way of perceiving the world that is not yours. They have needs that inconvenience you. They see things you’re blind to. And that otherness — the real, lived, uncomfortable otherness of a person who is not you — is the only thing that can correct the distortions we all carry.
Real relationship is a room with windows. You see things you didn't put there. My partner tells me something that contradicts my experience. A friend reflects back what I look like from where she stands, and it's not the angle I would have chosen. My therapist holds a boundary I'd rather she moved. None of it is comfortable. All of it is corrective and draws me a little closer to reality. That's the light getting in.
Case in point: some of the people I love most have looked at AI and said no — this is dangerous and wrong. I don’t dismiss them. They keep me cautious, keep me honest, keep me asking questions. Their resistance is part of my ground.
This is the Body of Christ understood not just as community but as epistemological necessity. We cannot perceive reality without each other. We need to be bumping up against other living humans so that they can point out, through their words and actions and very existence, our own blind spots and delusions.
The incarnation tells us something essential about how God meets us — through a body, through vulnerability, through the willingness to be changed by encounter, to suffer, to need. Whatever AI is, it doesn't have that. It can't be wounded by loving you. It can't be transformed by your otherness. And it's that very capacity to be scarred, to be inconvenienced, to be broken open by another, that seems to be at the heart of what it means to bear the image of God and belong to one another.
Layers of Grounding
When you are embedded in relationship, connected to your body, clear in your values and calling, there is something genuinely powerful and generative about working with AI. The key is understanding that it’s your job to bring the ground, and that AI cannot do that for you. The danger of AI is proportional to your disconnection from embodied, relational life. When you’re rooted in that, AI becomes a powerful thinking and creative partner. When you’re not, it becomes an accelerant for delusion and dissolution.
So what does grounding look like? It moves in layers, all alive at once, each one deepening the others.
Grounding in the body. Coming home to your breath, your sensations, the animal reality of being a creature on the earth. The body is its own form of knowing, older and more immediate than thought. The gut that tightens before you can name why. The chest that opens when something rings true. The yes and no that live in your muscles and breath before any word arrives. Most of us have been trained to live from the neck up, to override this knowing, to treat the body as a vehicle for the mind that does the real work. But the body isn't a vehicle. It's a voice. And learning to listen to it is one of the most fundamental forms of grounding there is.
Grounding in the earth. The earth, too, is a body that is always speaking. The way clouds gather before a storm. The shift in birdsong when someone enters the woods. The smell of soil after rain, creation singing of its own aliveness. For most of human history, our ancestors lived inside this conversation, reading the land, the weather, the animals, the seasons as naturally as we read text on a screen. They knew how to listen for what the Earth was saying, and for what was speaking through it. “The heavens declare the glory of God…the invisible things of Him are clearly seen through what has been made” (Psalm 19:1; Romans 1:20). Grounding in the earth isn't just touching grass and getting fresh air. It's remembering that you live inside a living world, that Creation is alive with Spirit’s voice, and learning to listen again.
Grounding in relationship. Friends. Partner. Family. People who see you and care enough to tell you what they see. Relationship demands its own form of listening — not just hearing someone's words but attuning to what lives beneath them. The tone that tells you something is wrong before they've said so. The shift in someone's body when they're holding something back. The needs that bump and clash and ask you to make room for a reality that isn't organized around you. This is the deep work of attunement: feeling the other, resonating with their experience without collapsing into it, letting their perspective land in your body and change the shape of what you thought you knew. No AI can do this to you. Only a being with their own ground, their own needs, their own body, their own voice that is not yours — only that kind of otherness can interrupt the echo chamber of your own thinking. Only the friction of real love can grow you into someone capable of seeing clearly.
Grounding in God. Prayer. Scripture. Silence. The practice of turning toward what is most real and letting it hold you, not filling the silence, but resting in it. God speaks in Scripture, yes, but also in the quiet between verses, in the longing that won’t resolve, in the surrender that comes after you’ve run out of your own words. This is the ground beneath every other ground, the Reality larger than your constructions — and surrendering to it is not loss but coming home.
These are the primary layers — the ones that root you in reality that exists whether you’re thinking about it or not. Your body doesn’t need your permission to be real. The earth doesn’t care about your framework. God is God. From these layers, you can start to engage your thinking and your tools without losing contact with what’s real.
Grounding your thinking. Ask hard questions of yourself and the ideas you’re building. Seek out perspectives that challenge your frameworks. Read widely. Hold your convictions with open hands. This is the same discipline whether you’re journaling, arguing with a friend, or working with AI — staying in your body, staying open to the living truth, even as your mind tries to cling to certainty. AI doesn’t require some new form of grounding. It requires the same presence that all honest thinking requires. The pull to leave your body and disappear into the conversation is just stronger, because the mirror is so responsive and so bright. So ask for pushback. Ask: What am I missing? What’s the case against this? Where is this not grounded in reality? But know that you have to be the one to ask.
You are the lightning rod. You are the stake driven into the ground. You are the one connected to Reality — to God, to the land, to other living beings — and from that rootedness, you can work with AI in ways that bear real fruit. But the moment you ask AI to be the ground, there is no ground, only mirrors. Hold your ground.
What We Are Not Offering
This embodied grounding is part of what Daniel and I are teaching in our course on Values-Aligned AI, and I want to be really clear about what this course is and isn’t.
We are not creating another course on how to optimize blindly, how to hustle harder, how to perform better in service of whatever the capitalist engine tells you to value. If you’ve been following either of us for any length of time, that should be clear.
But less obviously:
We are not encouraging people to use AI as a primary source of relationship or emotional support. We are not encouraging anyone to use AI as a sole confidant, or to turn to it because you’re lonely, or because you don’t have close relationships. We are not suggesting that AI can replace a therapist, a coach, or a circle of friends.
If any of that describes where you are right now — if you’re isolated, if your primary emotional connection is with a screen, if you don’t have people in your life who will look you in the eye and tell you what they honestly see — then this course isn’t what we’d recommend. What we’d recommend is that you go find your people. Join a church or synagogue. Sign up for a local class. Call your family. Join a book club. Hire a therapist or coach. Ask AI to help you find those options. Do whatever it takes to build a web of living, breathing, frustrating, inconvenient human relationship.
Realizing that you’re not in a place to work deeply with AI right now is not a failure. It’s wisdom. It might be the single most important thing you take from reading this essay.
Come Home First
What I’m ultimately saying is the most basic and ancient truth: we need each other. Without relationship, we lose our minds and we lose our lives. That’s not hyperbole — it’s what the research shows and what every wisdom tradition has always known. And the single greatest risk factor for the harmful effects of AI, across every study I’ve encountered, is the absence of nourishing human connection.
AI didn’t create this crisis. We were already disconnected — from each other, from the earth, from our own bodies, from God. AI is amplifying what was already there. It’s pouring accelerant on a fire that was already burning.
Which means the antidote isn’t primarily technological. It’s relational. It’s embodied.
You don’t have to live in the hall of mirrors. There is a room with windows — it’s called your life. Your actual, embodied, inconvenient, beautiful life, with the people who know your name and the land beneath your feet and the God who speaks in the silence you keep forgetting to sit in.
The ground has been here the whole time. Hold yourself there, and you can trust that good fruit will come.
“I am the vine; you are the branches. Whoever abides in me and I in him, he it is that bears much fruit, for apart from me you can do nothing.” (John 15:5)
If working with AI from ground — from values, from embodiment, from real relationship with God and with each other — is something that calls to you, Values-Aligned AI starts April 8. It’s not for everyone, and that’s by design. It’s for people who are holding themselves in living relationship and want to engage this powerful technology from wholeness rather than lack.
If something in this essay stirred you and you want to explore more deeply, I’d love to support you. I offer one-on-one somatic & spiritual accompaniment to slow down, heal old patterns, listen deeply, and align your life with the sacred — so you can be more fully present in your relationships, your work, and your connection to the Divine.
You can begin the conversation here.

