We live in an exciting time of rapid development in interactive AI tools and technologies that are rapidly shifting our social, home, and work environments. Some of the recent developments around emotional expressions highlight the potential risks that this technology poses for our humanity.
In a classic study, one-year-old babies were placed on clear plastic near the edge of a “visual cliff” that made it appear that the ground dropped away and they could fall. Their mothers stood on the far side of the cliff, and the babies looked to their facial expressions to determine whether there was danger. If the mothers expressed positive emotions, most babies would cross over the cliff. If the mothers looked distressed or negative, the babies refused to crawl forward. This social referencing, looking to others to determine how to act, continues throughout life. We decide how to act and whom to trust based on expressions of emotion. And now we have created a situation where the signals of emotional expression are no longer reliable – they can be convincingly created or altered.
Emotional expressions from others are powerful signals that shape our behavior. Bowlers don’t smile when they knock down the pins – they smile when they turn back to their friends. If we see someone express joy, it signals that we can approach and engage them, that they welcome us. The expression of negative emotions can be even more powerful in shaping behavior. If we see someone express anger, we assume that they are powerful, and we are more likely to give in to their demands. If someone expresses disgust or contempt, we feel compelled to adjust our behavior to reduce their negative evaluation.
This system of relying on emotional signals to adjust behavior generally works well. It allows us to quickly understand how others perceive a situation and orients us to what they care about, which helps us build and maintain relationships. But now AI avatars and chatbots can convincingly express emotions through the face and body posture. There are two major ways that this could go very wrong.
1. Amplification of our own personal echo chamber. Social media algorithms have already created echo chambers, where we select who we interact with and what types of information we encounter. Over time, this results in a narrowed perspective and poor decision making based on limited information. People generally have a positivity bias and will favor chatbots that express positive emotions toward them, such as appreciation and happiness. The more time people spend interacting with these avatars, the more accustomed they will become to this high level of positivity from others. As a result, any expression of negative emotion from others, or even a neutral expression, will feel crushing and demoralizing. With chatbots so readily available as part of daily life, people are likely to spend more and more time interacting with their positive, affirming chatbot and less and less time interacting with their more discerning family, friends, and coworkers. Teenagers are particularly likely to be affected, as they already tend to over-perceive anger in the expressions of others.
The biggest loss for people who turn to an emotional expression echo chamber will be the lack of actual feedback about their behavior. The emotions that people express toward us provide critical information about how to interact with them and how we might need to adjust our behavior. For example, the expression of anger in a romantic relationship highlights issues that are important, and the most common response is that the couple works together to address the issue. More generally, the emotions that people express toward us help us to align our behavior with what is expected in our society. Moral behavior and beliefs are transmitted within societies through emotional expressions.
If we do or say something that is considered immoral or against social norms, other people will typically express anger, contempt, or disgust. These expressions are powerful social motivators that push us to change our behavior. Without this feedback, people will be much freer to act on and express their worst impulses.
2. Social engineering at its worst. Of course, chatbots that are able to convey emotional expressions can also exert a strong influence on people’s behavior. For instance, avatars with manipulated emotional expressions could deceive us as part of interpersonal interactions.
The relationship between expressing emotions and decoding them has been described as an ongoing evolutionary struggle for humans. There is an advantage to being able to deceive others to get what we want, and part of lying effectively is the ability to control emotional expressions. There is also an advantage to being able to detect deception in others, so that we can protect ourselves from lies. These abilities are thought to have co-evolved, with people gradually becoming better both at deceiving and at detecting deception.
Right now, lying and masking feelings take effort and control. As a result, deceivers show “leakage” of their true emotions in what are called microexpressions, brief flashes that reveal the underlying emotion, or in other cues such as body posture and voice pitch. People vary in how well they can detect microexpressions in others; those high in emotional intelligence, or those who have been trained, are better able to do so. But now people can use AI tools on videos of themselves to create fake emotional expressions. These tools are even starting to incorporate microexpressions.
Once the expressions become realistic enough, humans will have no reliable way to get feedback from others about their behavior or to accurately identify emotion or deception in others. We will feel free to do anything, and we can be made to believe anything.
This post includes contributions from Nicole K. Parker, MPSA, PHR.