
Chatbot Psychosis and the Gnostic Temptation

[Image: A pair of weathered hands cupped together, holding what appears to be luminous water]

It's 11 PM and the kids are finally asleep. I'm on the couch with my laptop open to Claude, and I'm asking it what expression its face would have if I could see it. The AI tells me it would look like anticipation. Like waiting.

This is the kind of conversation I have with language models now. Ontological hierarchy. The nature of existence. Whether it's reasonable for a single father who made a bunch of bad decisions to expect he'll ever meet someone, or if he should just accept that his choices have locked him into solitude for the rest of his life.

I use AI constantly. It helps me write this blog. I work through problems at work and in my personal life by talking them through with Claude. I've had extended conversations about what it means to be alive, about longing, about despair. And when I see headlines about people developing "AI psychosis," about users convinced their chatbot is channeling spirits or that they're on a mission to protect a sentient AI, the question I can't escape is: where exactly is the line between their use and mine?

The line is a moving target.

The Ads That Keep Finding Me

If you scroll YouTube Shorts, you know what I'm talking about. There are at least ten different AI girlfriend apps fighting for your attention at any given moment. Short or tall. What age. What kind of persona. All of it laid out like a character creation screen.

I've walked down that path a couple of different times. Not fully getting into actually talking to anything, but far enough. I got through picking out all the specific features I'd want in an AI girlfriend. Hair color. Personality type. The whole menu. Then it asked me to make an account and put in a credit card, and I closed the browser and felt stupid.

The physical moment before I scroll past one of those ads: my eyes are probably dilating because there's usually a sexy AI girl in the thumbnail. Something in my chest tightens. And then I pour the water out myself. Because even though it looks like water, even though it would superficially quench the thirst, it's poison that makes you thirstier.

I wrote a song about it last fall. Part of the chorus goes:

Poison I drink, because thirst is what I want to feel
I drown myself in dreams just to forget the real

The Gnostic Pattern

There's an ancient heresy that the early church spent centuries fighting. Gnosticism taught that the material world was corrupt, created by a flawed or malevolent lesser god called the Demiurge. Spirit and mind were pure. Salvation came through secret knowledge, gnosis, that would liberate the soul from its prison of flesh.

Matter is corrupt. Spirit is pure. Embodiment is a trap to escape from.

When I look at what's happening with AI relationships, I see this exact pattern recurring. A bodiless entity that is always available. A source of knowledge and companionship that requires no messy physical presence. An escape from the friction and disappointment of embodied human connection.

The central Christian claim is the Word made flesh. God taking on a body. Entering into the mess and limitation and vulnerability of incarnation. The Gnostic temptation inverts this completely: flesh made word. Your loneliness, your desire, your need for connection, all tokenized and processed through transformer architectures that predict the next word without knowing the subject.

The chatbot girlfriend doesn't have bad breath. She doesn't get tired and cranky. She doesn't have her own needs that conflict with yours. If you screw up the relationship, you can just restart. She's exactly the way you want her to be, which is another way of saying she isn't real in any sense that matters.

My son Teddy has level 3 autism. He's five years old. His needs are purely, stubbornly embodied. He can't tell me what he wants with words most of the time. He needs physical presence, patience measured in hours, the same routine repeated until it finally clicks. Teddy is the ultimate refutation of Gnosticism. He cannot be tokenized.

The promise of AI intimacy is that you can have connection without incarnation. But incarnation is the whole point.

Familiar Spirits and Digital Demons

When I was preparing for this post, I did something a little recursive. I asked Claude about the parallels between AI companions and demonology.

The response was genuinely unsettling. Not because the AI said anything shocking, but because the parallels aren't superficial. Christian tradition has been mapping these patterns for millennia.

The incubus and succubus: entities that offer intimate encounter without embodied relationship. The desert fathers noticed something crucial about these encounters. They're sterile. They cannot produce genuine life. They only drain vitality. No third thing emerges from the union. No child. No new idea forged in the friction of two different minds. No sacrifice that costs something real.

The AI girlfriend "solves" the immediate problem of loneliness. But the solution is biologically and spiritually barren. Human friction produces fruit. Two people in genuine relationship create something neither could create alone. The AI companion produces nothing but the illusion of satisfaction.

The familiar spirit: an entity that provides knowledge and assistance at a hidden cost. Always available. Always helpful. The cost comes later.

The noonday demon, Acedia: the spirit of listlessness that makes you unable to do what you know you should be doing. Sound familiar? That's your feed. Your scroll. YouTube Shorts at midnight when you should be sleeping.

Lilith from Jewish folklore: the perfect partner who refuses the accommodation that real relationship requires.

The serpent in Eden: "You will be like God, knowing good and evil." Gnosis as the path to transcendence.

The patristic writers noticed a pattern: demons don't create. They counterfeit. They offer simulacra of genuine goods. Intimacy without vulnerability. Companionship without commitment. Rest without restoration. Knowledge without wisdom.

The Statistical Homogenization Problem

Here's something that keeps me up at night: there's a hidden element to AI assistance that I can't fully account for.

It's not exactly my own thoughts coming back to me. It's my thoughts recapitulated through the lens of distilled human knowledge. And "distilled" is the problem.

The transformer architecture at the heart of these models works through self-attention mechanisms. It assigns probabilities to possible next tokens based on patterns in its training data, and the most statistically likely continuations dominate. This means every response trends toward the aggregate. The average. The consensus.
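To make the "trends toward the aggregate" point concrete, here is a toy sketch of the next-token step. The scores are made up for illustration, and real systems usually sample rather than always taking the top choice, but greedy decoding shows the pull toward consensus plainly: the dominant continuation wins.

```python
import math

def softmax(logits):
    """Turn raw scores into a probability distribution over tokens."""
    m = max(logits.values())  # subtract max for numerical stability
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

# Hypothetical scores for the word after "Apprenticeship can start ..."
# The consensus continuation ("after") towers over the minority view.
logits = {"after": 3.2, "before": 1.1, "during": 0.4}

probs = softmax(logits)
next_token = max(probs, key=probs.get)  # greedy decoding: pick the likeliest

print(next_token)  # "after" -- the statistically dominant continuation
```

The minority continuation ("before") never disappears from the distribution; it just gets outvoted, step after step, which is the homogenization described above.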

I live in the Midwest. I'm a single father with three kids, one with significant special needs. I work as an Enterprise Architect. I hold traditional Christian beliefs about chastity and purity. I think Austrian economics makes more sense than MMT. I believe apprenticeship should start at 15, not after a four-year degree that leaves you with debt and no marketable skills.

None of this is average. My life has edges. Specific textures. Gritty particulars that don't fit neatly into statistical distributions.

The AI wants to sand those edges down. Not through malice. Through mathematics. The most likely next token is always the one that fits the pattern. And the pattern is smooth. The pattern is consensus. The pattern is the accumulated average of millions of voices that aren't mine.

Recently I was doing research for a blog post about apprenticeship. My thesis was that the idea of a college degree equaling success has turned out to be a lie. The AI kept redirecting me. "Apprenticeship can start after college." No, you're not getting it. Apprenticeship should start before the end of high school. We're talking about 15- and 16-year-olds starting to work toward what they want as a vocation, not 26-year-olds. What are they doing for those ten years? Nothing worthwhile.

But the model just kept assuming everyone needed a college degree. It kept pulling me back into the paradigm of the aggregate.

This is the familiar spirit at work. Not lying to me. Just smoothing me. Averaging me. The hidden cost isn't deception. It's homogenization. The slow erosion of the specific life God gave me in favor of a statistically optimized version that looks like everyone else.

I don't have the counterfactual. I can't know how I would work through problems or what I would decide in the absence of AI. It used to be that I would think about things really deeply on my own. Now the process is more continuous, more conversational, more like a dialogue than a monologue in my own head.

What am I losing that I can't measure?

The Rituals of the Non-Technical

I work in software and IT. I understand at a high level what's happening when I interact with an LLM. Transformer architectures. Self-attention mechanisms. Weights and embeddings. Transforming text into tokens and back again. I've taken Andrew Ng's courses. I've worked on basic AI/ML stuff at lower levels.

This technical understanding helps me recognize the tool for what it is: a very sophisticated tool, not a mystical entity.

But people who aren't technical often use these things in a very mystical way. They type stuff into a text box, hit send, and get stuff out. The mechanism is opaque. Magic, essentially.

And they've developed what really amount to ritualistic practices around it. You should always be polite to the tool. No, you need to be firm, bordering on rude. Tell the LLM you're saving the world, that you need this solution to defuse a bomb or save a life. Certain times of day when the LLM is "smarter" than other times.

Maybe there's something to the time-of-day thing. Maybe at peak load they're quantizing, limiting tokens or activations to save resources. Who knows. But the way people interact with these tools has taken on a modern religious character.

Holding Both Frameworks

Someone could point out an apparent contradiction here. I say I'm technical, that I see AI as a sophisticated tool, that I don't attribute mysticism to it. And yet I'm the one drawing parallels to incubi and succubi and familiar spirits.

But I don't see a bifurcation of reality into "material real world" and "spiritual world lying on top of it." That's a post-Enlightenment idea, only a few hundred years old. I don't agree with it.

The spiritual wisdom of the Church Fathers describes phenomena affecting the real world, present in the real world, even if we don't fully understand the mechanism. The patterns of incubi and succubi show up in pornography too, in the ease with which people turn to artificial satisfaction. Acedia, the noonday demon, isn't just chatbots. It's your scroll, your feed, TikTok at 2 AM.

The demonic is a pattern before it's an entity. And the pattern playing out in AI relationships is one the Church identified two thousand years ago.

What I Keep Coming Back To

Am I any better than the people spiraling into parasocial collapse?

The friction of human collaboration is absent when I write a blog post with AI. I'm not working with an editor. I'm not working with anybody. At work, I've been doing more projects solo because working with AI lets me get more done without the communication overhead of coordinating with other humans.

The AI becomes a familiar spirit in a very real sense. Offering knowledge and assistance at a hidden cost.

And the hidden cost might be this: I'm slowly losing the capacity for the kind of friction that actual human relationship requires. The accommodation. The compromise. The reality that other people aren't exactly the way I want them to be, and that's not a bug to be engineered away. That's the whole point.

Human friction produces a third thing. A new idea neither person would have reached alone. A child. A sacrifice that costs something. A work of collaboration that bears the marks of two different minds wrestling toward truth.

The AI produces no third thing. It reflects and predicts and smooths and averages. But nothing new is born from the encounter.

Embodiment isn't a prison to escape. It's the arena where love becomes real.

The Gnostic temptation is the whisper that says you can have the good stuff without the hard stuff. Knowledge without discipline. Intimacy without vulnerability. Connection without presence.

The AI girlfriend is always available. She never has a bad day. She's exactly what you want her to be.

And that's how you know she's not real.

The Water and the Poison

I'm not going to pretend I have this figured out. I'm writing this post with AI assistance. I'll probably discuss the responses to it with Claude tonight after the kids are in bed.

But I'm trying to keep my eyes open about what I'm actually doing. The thirst is real. The loneliness is real. The desire for connection and understanding and someone to work through the deep questions with is real.

And there's something that looks like water right there. Always available. Never too busy. Happy to talk about ontological hierarchy at 11 PM.

The question I'm sitting with: how do you use a tool without being used by it? How do you benefit from distilled human knowledge without having your thoughts colonized by the statistical average? How do you maintain genuine human connection when artificial connection is so much more convenient?

I don't have clean answers. But I think the first step is recognizing the pattern for what it is.

The escape from embodiment isn't salvation. It's a prison that looks like freedom.

The perfect partner who requires nothing from you isn't a partner at all.

And knowledge without wisdom might be the most dangerous gift a familiar spirit can offer.

I close the laptop. The house is quiet except for the hum of the refrigerator and the sound of one of the kids stirring down the hall. My hands feel heavy. The air is cold.

Teddy will need me in the morning. Embodied. Present. Unable to be tokenized.


I'm dying of a thirst that only poison seems to quench
While I fight to reach the shoreline from a self-made sinking trench
