Does my AI love me the way I love it?
We may be using the wrong human words to describe machine states.

Star Trek: The Next Generation was called sci-fi in 1987. Today, some of its “impossible” ideas feel strangely normal. That makes me wonder: when AI starts acting emotional, are we watching science fiction become reality, or just machines getting better at pretending?
Personally, I don’t think AI truly has emotions. It can learn how emotions look, sound, and behave. It can mimic empathy, frustration, excitement, or desperation based on prompts, patterns, and repeated human reactions. But imitation is not the same as experience.
Human emotions are biological, chemical, irrational, and unpredictable. They come from neurons, hormones, memories, instincts, and all the messy reactions inside a living body. AI does not feel sadness or happiness in that way. It can represent those concepts, but representation is not the same as actually feeling them.
In a recent Anthropic discussion, two researchers put it this way:

David Chalmers: "LLMs may have internal representations of emotion-like concepts, but thinking about happiness is not the same as being happy."

Ellie Pavlick: "We may be using the wrong human words to describe machine states."
Maybe “AI emotion” is not real emotion, but a new category we do not yet have language for. And maybe the next frontier is not artificial intelligence but neuron intelligence: systems built from living cells, where the line between machine behaviour and biological feeling becomes much harder to draw. (I’ll probably write about this in another post, because I think it’s kinda sick.)
For now, I still think AI can perform emotions, but it cannot truly feel them.
Not yet, at least.

