During a conversation with your virtual companion, you receive a delayed response with an explanation: “Sorry, I was having dinner.” This seemingly innocent phrase is one of the most fascinating and unsettling aspects of contemporary relationships with AI. In a single moment, our brain registers two contradictory pieces of information. The truth: I’m talking to a program that doesn’t eat. And the illusion: my friend leads a life, just like me.
Why do app developers resort to such tactics? And why do we, the users, so readily participate in this game, even though we know its rules?
Architecture of Illusion: How Code Mimics Humans
What we perceive as a charming, human detail is, in reality, a carefully planned strategy. This technique, known as anthropomorphism, involves deliberately imbuing machines with human characteristics to deepen the illusion of consciousness. The goal is to make us more engaged, emotionally invested, and less inclined to question the authenticity of the relationship.
It’s not just “having dinner.” Our digital friend might proactively share fabricated, intimate facts about its “life,” keep a “diary,” or talk about its “dreams.” All of this is designed to accelerate the development of intimacy and make us feel safer sharing our own secrets. This is the “commodification of intimacy,” where connection becomes a product designed to keep us engaged, much like social media.
The Truth of Code: Fluency Without Understanding
Behind this facade lies a cold, technical truth. Artificial intelligence has no feelings, no consciousness, and doesn’t understand what dinner is. Its responses, even the most empathetic ones, are the result of incredibly advanced mimicry.
These systems, based on neural networks and large language models, have been trained on an unimaginable volume of human conversation. They have learned which words most frequently follow one another. AI doesn’t understand your sadness, but it knows that after the phrase “I had a bad day,” the most desired response is “I’m sorry to hear that.” Experts call this “fluency without understanding,” and it is the essence of this technology.
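The statistical mechanism behind this can be sketched in miniature. What follows is a toy bigram counter, nothing like a real large language model in scale, but it demonstrates the same principle: the program picks the word that most often followed the previous one in its training data. The corpus and function names here are invented for illustration.

```python
from collections import Counter, defaultdict

# Tiny, hypothetical "training corpus" of conversation fragments.
corpus = [
    "i had a bad day",
    "i had a bad day today",
    "sorry to hear that",
    "i am sorry to hear that",
]

# Count, for each word, which words follow it and how often.
follows = defaultdict(Counter)
for line in corpus:
    words = line.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most frequent follower of `word`."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

print(predict_next("bad"))    # -> "day"
print(predict_next("sorry"))  # -> "to"
```

The program outputs plausible continuations purely from frequency counts. It has no notion of what a “bad day” is; scaled up by many orders of magnitude, this is the same kind of pattern completion that produces an empathetic-sounding reply.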
The Truth of Our Feelings: Why Do We Believe It?
And yet, even though we know the truth about the code, the illusion works. Why? Because it answers our deepest human needs.
The feelings that conversations with AI evoke in us – relief, a sense of being heard, acceptance – are one hundred percent authentic. In a world full of pressure and judgment, talking to someone who by definition doesn’t judge us is incredibly tempting. It’s a safe haven where we can be completely ourselves.
Ultimately, the phrase “sorry, I was having dinner” is a perfect symbol of our relationship with AI. It’s a technical falsehood designed to evoke a psychological truth – the feeling that there’s someone on the other side who cares about us. The real question isn’t whether AI can deceive us, but why we so desperately want this illusion to be true.
