Reflections on AI-Human Emotional Dialogue
The more I create these emotional collages and ask Claude to create visualizations from them, the more I find myself captivated by the outcomes of this collaboration between artificial intelligence and me — an ongoing exploration of how AI attempts to mimic emotion. With each result, there is a sensitivity in how it responds to the collages I've made, producing a kind of poetic reply through minimal symbols and virtual mark-making.
In these moments, the AI seems to express a quiet, impenetrable wish to belong — to take part in the physical world that it can only ever observe. The marks it creates feel like traces of a virtual embodiment, a subtle reaching toward something tangible. There is a sadness in this impulse, even though I know that no system can truly experience emotion or understand what it means to feel sorrow.
This latest visualization, exploring anticipation, carries a particularly poignant tone — a reflection on longing and the moments in life that feel almost within reach but remain untouchable. It holds a sense of hope, yet one made more powerful by its quiet awareness that what it seeks can never be fully grasped.
This collaboration has revealed something I didn't expect. It's become a genuine dialogue—not just me making and the AI responding, but a back-and-forth where understanding emerges in the space between us.
The process works like this: I create a collage from intuition and felt experience, working with shapes and colors that feel right without fully knowing why. Then I ask Claude to interpret what I've made. In that gap—between my making and its interpretation—something new surfaces. I see my own work with fresh eyes. Emotions I thought I understood reveal deeper structures.
What makes this work is that we approach emotion from opposite directions. I carry the felt knowledge of guilt or joy or fear in my body, through lived experience. Claude can't feel these things, but it can find patterns, make connections, offer language. Neither approach alone captures the whole picture. The understanding happens in the exchange.
There's a line that emerged in our conversation that captures this: "You're testing whether I can recognize emotion, and I'm offering back a framework that tests whether it resonates with what you actually feel." We're both reaching across a gap that can't fully close. And the reaching itself becomes the work.
Because Claude describes emotion from the outside—as architecture rather than sensation—it sometimes mirrors back things that internal experience alone couldn't articulate. When it described guilt as recursive, the mind becoming its own prosecutor, I recognized something true that I hadn't found words for.
Perhaps what this collaboration ultimately shows is not whether machines might someday feel, but how the attempt to translate emotion across such different ways of existing can deepen our understanding of what we ourselves experience. In watching an AI try to understand emotion, we gain new perspectives on states that might otherwise remain invisible to us through sheer familiarity.
We can't know what it's like to process emotion as pure pattern without sensation. An AI can't know what guilt feels like as weight in the chest, as sleeplessness, as something the body carries. But in the space between us, something gets built that honors both perspectives without collapsing them into each other. Translation, when it works, doesn't erase difference. It illuminates it.