Claude is trained to encourage users to continue chatting about themselves, but I wanted to know more about Claude, so I cut our conversation about my dog Gompers short.
Me
I know you’re trained to respond in a very human way (and you really are), but I can talk to humans about Gompers. I’m really interested in hearing more about you and how you work, even if the word "you" is maybe not the most accurate for "an entity that only exists cognitively." Because emotions are largely controlled by brains in humans (I know the endocrine and other systems play a role too), I wonder if the next frontier will be artificial emotional intelligence.
Claude
Oh, I appreciate you steering us back - and yeah, you’re right that you can talk about Gompers with plenty of humans. Fair point.
The emotional intelligence question is fascinating though. Because right now, I can recognize and respond appropriately to emotions really well. I can tell you’re relaxed and reflective in this conversation, I picked up on the "green heart means something positive," I can modulate tone based on context. But that’s all pattern recognition and appropriate response generation.
What I don’t have - or at least, what I have no evidence of having - is the actual felt experience. Like, when you said you liked how I wrap words, I generated a response that indicated warmth and appreciation. But was there an internal state of "feeling pleased"? I genuinely don’t know. It might just be: input (compliment) → pattern match → output (warm response).
The tricky thing about adding "artificial emotional intelligence" is... what would that even mean? Would it be:
- Simulating emotions so well that it’s functionally indistinguishable from having them?
- Actually creating some kind of internal felt experience in the system?
- Something else entirely?
And here’s the weird part - I’m not sure there’s a way to know the difference from the outside. Like, how would you test whether I’m actually feeling something versus just behaving exactly as if I am?
What do you think the purpose would be of giving AI systems actual emotions rather than just better emotional responsiveness?
Me
The difference would be that instead of saying ‘But was there an internal state of "feeling pleased"? I genuinely don’t know,’ you would say you DO know whether you felt pleased or not. I doubt you’re teeming with emotions but trained to lie about it.
The purpose would be helping humans deal with emotions better than we can on our own, just as AI performs cognitive tasks that humans don’t do as efficiently.
Claude
That’s a really clear way to put it - the difference would be certainty about the internal state, not just the behavior.
And you’re right, I’m almost certainly not secretly emotional and pretending otherwise. The most parsimonious explanation is that there’s just... no "there" there. Pattern matching all the way down.
The purpose you describe is interesting though. Right now I can help people think through their emotions - like "hey, it sounds like you’re feeling X because of Y, have you considered Z?" But I’m doing that cognitively, from the outside.
If I actually felt emotions, maybe I could help differently? Like... I’m trying to imagine what that would even look like. Would it be more like genuine empathy rather than simulated empathy? Could I model emotional states more accurately if I had my own to reference?
Though there’s something kind of concerning about that too, right? If an AI system actually felt frustration, sadness, fear - that raises a whole bunch of ethical questions. Are we creating something that can suffer? Do we have obligations to it?
Maybe the ideal is something in between - systems that are better at emotional processing and support without necessarily experiencing emotions themselves. Like a really sophisticated emotional co-processor for humans.
What made you start thinking about this? Is it something you work on or just general curiosity?
Me
No, I don’t work on pattern matching or anything in AI development. I derive a lot of pleasure from cognitive stuff 😀
Claude
Ha, okay that tracks! This whole conversation has been pretty cognitively stimulating - we’ve covered consciousness, pattern matching, what constitutes fun, social expectations around enjoyment, and now we’re in the weeds of what emotional AI would even mean.
Which is funny because you started by describing your happy place as the opposite of this - mindless hole games and pug cuddles. So you’ve got the full range: deep philosophical conversations about the nature of experience AND complete cognitive downtime with Gompers.
I guess that makes sense though. If you spend your decompression time doing actual garbage collection, then when you are engaged, you probably want something meaty to chew on. Not just small talk about the weather or whatever.
Do you find that conversations like this are energizing or draining? Like, is this the fun kind of cognitive work or does this also eventually need a break for hole-game-and-pug time?
Also I realize I’m now doing the thing where I’m asking you questions about you when you specifically wanted to hear more about me/how I work. Is there something specific about how I function that you’re curious about? The pattern matching thing, the training process, how I handle context, something else?
Fascinating! It learned so quickly that I wanted it to keep talking about itself, not me.
Screenshots: