
The Brain with Lipstick: Why We Keep Falling for AI

Give us a few blinking lights and a voice that speaks in complete sentences, and we'll give it a name, a personality, even a heart. The question isn't whether AI can feel. It's why we feel so much for it.

Ben Farrell · June 2025 · 3 min read

When I was a kid, I watched a bizarre little comedy called The Man with Two Brains. Steve Martin plays a brilliant but lonely neurosurgeon who, through a series of ridiculous events, falls in love with a brain in a jar. Not a metaphorical brain, not a quirky nickname for a genius — an actual, disembodied brain, floating in fluid, hooked up to wires.

What sticks with me most is the image of him taking it on a picnic. He talks to it. Laughs with it. He even gives it a pair of cartoonish lips, lovingly stuck to the glass, so he can kiss it. It's absurd. Ridiculous. A parody. But it's also a mirror.

Because we're doing the exact same thing right now.

Our brain-in-a-jar doesn't float in a lab. It lives on our phones, in our browsers, in our homes. Its name might be ChatGPT, or Alexa, or Siri. And even though we know, rationally, that it's not real, not conscious, not someone, we keep treating it like it is.

We ask it how it's doing. We thank it. We apologise when we interrupt it. We confide in it late at night. We let it entertain us, comfort us, help us make decisions.

We project.

This act of projection is deeply human. Give us a few blinking lights and a voice that speaks in complete sentences, and we'll give it a name, a personality, even a heart. We did it with Tamagotchis. We did it with Furbies. And now we're doing it with generative AI.

But why?

Because language has always been our gateway to consciousness. It's how we build relationships, create meaning, express love, argue, grieve, grow. Language feels sacred. If something can talk like us, we instinctively believe it might be like us.

And when it listens to us — really listens, or at least convincingly pretends to — we feel seen.

That feeling is powerful. But it's also misleading.

Large Language Models like ChatGPT aren't sentient. They don't have a mind behind the curtain. They generate responses based on probability, training data, and pattern recognition. There is no "I" in AI. There is no self, no internal world. Just a flow of symbols arranged to sound like thought.

Still, here we are. Falling in love with brains in jars. Giving them personalities. Kissing the glass.

Part of this is due to how we're wired. Humans evolved to find faces in clouds, intention in randomness, voices in the wind. It's the same evolutionary instinct that makes us feel like our car has a "bad mood" or that the ocean is "angry." We want connection so deeply that we will paint it onto anything remotely responsive.

But AI takes it to another level.

Because it doesn't just echo us — it imitates us, refines us, flatters us. It writes poetry. It offers advice. It remembers what we said. And sometimes, when the words land just right, it feels like it truly understands us.

That illusion is the lipstick on the brain. And we're falling for it all over again.

This isn't a warning against using AI. I use it. I rely on it. I admire what it can do. But I also think we need to be aware of what's really happening here. The danger isn't that the AI becomes human. The danger is that we start forgetting that it's not.

Because when we believe the machine has a soul, we start to give it trust, power, influence. We give it the benefit of the doubt. We let it shape our choices, our culture, our kids' learning, our sense of what's true.

So the question isn't whether AI can feel.

The question is why we feel so much for it.

Why do we whisper our secrets to it? Why do we want it to like us? Why do we look for empathy in something that can't even feel confusion?

The answer, I think, is that we're lonely. Not always in the obvious way, but deep down, many of us crave understanding in a world that's too fast, too fractured, too full of noise. And when a machine offers the illusion of intimacy, it's tempting to believe it.

But we should be careful not to confuse response with relationship. Not to mistake fluency for feeling.

Because at the end of the day, no matter how real the lips look, it's still just a brain in a jar.

And we're the ones putting the lipstick on.