What emotions are and how design is connected to them
Emotions are complex. They are not feelings, nor are they desires. I’ll define emotions as a biopsychological process that happens inside the body and serves as an information-processing tool. I have often heard emotions set in opposition to rationality (by some coincidence, frequently in a sexist logic). But it’s quite the opposite: emotions matter in effective decision-making. The way interactive interfaces, data visualizations, and other design systems around us are constructed may influence our emotional experience and processing. The ability to meaningfully experience data visualizations through emotional feedback enhances engagement.
In a design context, how we reflect on our emotional experiences can vary depending on the system architecture. That matters because we are surrounded by interactive systems, from AI-based digital products to newsroom data visualizations and train ticket machines. This is why the framework for constructing design systems that fascinates me is affective computing, a discipline researching how interactive systems can detect and respond to emotions. Currently, techniques such as emotion recognition via audio, speech, and physiological data, as well as sentiment analysis of textual evaluations and opinion mining, are used to obtain this information. But is this factual data enough to effectively and empathetically evaluate the meaning of communication? Boehner and colleagues wrote that there are many caveats to interpreting emotions, such as the limitations of the chosen evaluation method. This means that methods of factual evaluation should be combined with cultural understanding and nuanced assessment.
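To make the limitation concrete, here is a minimal sketch of the kind of lexicon-based sentiment analysis mentioned above. The tiny word list and scoring rule are my own illustrative assumptions, not any production system; the point is that such factual evaluation has no access to irony, culture, or situation.

```python
# A deliberately tiny polarity lexicon -- an illustrative assumption,
# not a real sentiment resource.
LEXICON = {"love": 1.0, "great": 0.8, "fine": 0.2,
           "annoyed": -0.6, "hate": -1.0}

def sentiment_score(text: str) -> float:
    """Average the polarity of known words; 0.0 means neutral or unknown."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    hits = [LEXICON[w] for w in words if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

print(sentiment_score("I love this, it is great"))  # clearly positive
print(sentiment_score("Fine. Just fine."))          # scored mildly positive,
                                                    # even if the speaker is seething
```

The second sentence is exactly the kind of culturally loaded utterance where a purely factual method needs the nuanced assessment Boehner and colleagues call for.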
Emotions are constructed not only within physiological but also cultural and social contexts. The manifestation of an emotion must make sense within the cultural context in which we live, and our own reaction to our emotions matters too. Am I ashamed to be openly angry, or do I feel justified? (Although it is not an emotion, I keep thinking about a North Korean refugee explaining that there is no concept of “depression” in North Korea. If one has depression in North Korea, how is the interaction with this state constructed and articulated?) Dr. Rosalind Picard, the founder of affective computing, said: “Originally, affective computing was an area of research created to give technology the skills of emotional intelligence. The goal is to create technology that shows people respect, such as by not continuing to do things that cause people to become frustrated or annoyed.” She also says, “There is no magic sensor that will accurately convey how someone is feeling. We need to combine AI learning with lots of information from multiple channels gathered over time to even make a guess at feelings.”
Two models
Boehner and colleagues proposed a classification of affective computing systems based on their goal: the informative model and the interactive model.
The informative model is based on the idea that emotions can be classified, symbolically encoded, categorized, and transmitted. The success metric for such a system is whether the emotion I sent to another user is interpreted correctly.
The interactive model is based on the idea that emotions are constructed in a process of interaction. The goal of a system in this framework is to provide a place to reflect on emotion. The metric of success is whether the system helped the user interpret, understand, and reflect on their emotional state.
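The contrast can be sketched as two minimal interfaces. The class names, methods, and emotion taxonomy below are my own illustrative assumptions, not part of Boehner and colleagues’ paper; they only make the difference in success metrics tangible.

```python
class InformativeChannel:
    """Encodes an emotion into a fixed category and transmits the label."""
    CATEGORIES = ("joy", "sadness", "anger", "fear")

    def send(self, emotion: str) -> str:
        # Success metric: the received label matches the sent one.
        return emotion if emotion in self.CATEGORIES else "unknown"

class InteractiveChannel:
    """Collects raw behavioural cues and leaves interpretation open."""

    def send(self, cues: dict) -> dict:
        # Success metric: did the cues help the people involved reflect?
        return {"cues": cues, "interpretation": None}  # deliberately open

print(InformativeChannel().send("melancholy"))          # falls outside the taxonomy
print(InteractiveChannel().send({"typing_pauses": 7}))  # cues only, no verdict
```

An emotion that falls outside the predetermined categories is simply lost in the informative channel, while the interactive channel passes the raw cues through and refuses to supply the answer itself.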
Scheme by the author
How the models work
I find the comparison between the informative and interactive models important today because, in a world of deceptive patterns, misleading charts, and AI that maximizes engagement over safety, it is important to design systems that can not only comprehend emotions but also offer a safe space for reflection and empathy without pushing the limits.
How would an informative model differ from an interactive one? Let’s say I have a conversation with my childhood friend. Two scenarios:
- We use instant messaging and exchange emoji. The transmission of emotion is limited by the available range of animations and character settings (still, it can be fun). My sadness becomes a static crying cat, and my friend’s irony becomes an animated character resembling her, but with orange hair. This is the informative model.
- We use the email agent EmoteMail (old and dead, but interesting nevertheless). EmoteMail took photos of the user’s face while she was writing an email. Each photo was automatically placed near the paragraph written when it was taken. Moreover, paragraphs were color-coded to reflect how long they took to write. (I am happy this is not my compulsory work email agent. But for a conversation with a friend, a long-distance flirt, or a quarrel, it could perhaps be an interesting experience.) This is the interactive model.
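EmoteMail’s color encoding can be sketched in a few lines. The thresholds and hex colors below are my own assumptions for illustration; the original project’s actual mapping is not documented here. The point is that the system encodes behavior (writing time), not a verdict about emotion.

```python
def duration_to_color(seconds: float) -> str:
    """Tint a paragraph by how long it took to write: a behavioural cue,
    left for the reader to interpret. Thresholds are illustrative."""
    if seconds < 30:
        return "#f0f0f0"   # dashed off quickly
    if seconds < 120:
        return "#ffe8cc"   # some deliberation
    return "#ffc9a3"       # a long, effortful paragraph

paragraphs = [("Hi! Quick note before my flight.", 12),
              ("About what you said yesterday...", 240)]
for text, secs in paragraphs:
    print(duration_to_color(secs), text)
```

A recipient seeing the second paragraph tinted dark might infer hesitation, or care, or distraction; the system supplies the clue, not the conclusion.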
We have two totally different conversation spaces. The system can either analyze the result of an interaction (informative model, emoji) or catalyze interpretations of the interaction (interactive model, EmoteMail). Unlike emoji-type apps, EmoteMail-type apps remove predetermined classifications, providing more direct access to the other person’s emotion and an instrument for interpretation via data collection and representation. Predetermined classifications may limit the overlay of cultural and situational context. (Have you ever wanted to react to someone’s Instagram story, but pressing the heart, fire, or clapping hands was deeply contextually inappropriate?) In EmoteMail, the shared context of the two people enhances the meaning of the interaction, and the visualization of behavior provides clues to a person’s emotional state without providing answers. It allows users to draw their own conclusions, offering a framework for data collection as a playground for interaction. However, the design system might have been pushed too far. In a forum discussing the project, a user named Adam Kazwell wrote: “the thing that scares me about EmoteMail is having the recipient see what didn’t show up in the final draft. What value is added knowing that I misspelled recipient 3 times before I posted this comment? (…) If you want more than just cold-hard text, maybe pick up a phone or meet face-to-face :)” That was in 2004. In post-COVID 2025, full of cold-hard texts and of videoconferencing that is no perfect substitute for face-to-face communication, perhaps it is a good time to build design systems using both the interactive and the informative model.
Not only may we not want to overshare in design systems, as Adam Kazwell noted, but sometimes we need time to process and understand the emotion we are experiencing. That’s why I don’t talk to AI about my emotions: I don’t want any priming or forcing; I need to get there myself. And after the recent Congressional hearing “Examining the Harm of Chatbots,” on the tragic deaths of teenagers who interacted with AI agents, the question of the safety of technologies that can mimic empathy as confidants is pressing. Perhaps the interactive model of computing can provide ideas on how to construct safer systems and balance the widely used informative model, which may understand emotions but drives them toward engagement.
An example of an interactive model in data visualization that gives space for reflection on emotions is the Tied Knots project, which tells stories of harassment in academia. It provides users with a space to reflect on emotions and on lived or observed experiences, and fosters a sense of community without pushing users toward any particular conclusion or emotion. It prompts users to assess the situation and perhaps even make some personal decisions. Another example is Affective Diary, a data visualization project that empowered participants to track their emotional experiences via guided questions and sensor tracking of arousal and movement (something that can be found in health trackers like Oura). However, the researchers found that graphs were not the best way to connect with emotional experiences, and proposed that data visualization for empathy should look “familiar”.
The way social media platforms mine data about users is closer to the informative model, with the intention of correctly understanding, predicting, and profiting from users’ emotions. On the positive side, health tracking apps and devices also utilize the informative model by collecting physiological data to assess the user’s physical and emotional state, which, with ethical data collection, can promote well-being and improve health. A good example of an informative model that provides a reflective space without pushing boundaries is the app How We Feel, which helps users understand their emotional state by offering hundreds of emotions to choose from, each with its own classification. At the end of the week, the user receives a data visualization of their emotion distribution, as well as tools to manage those emotions.
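A weekly roll-up in the spirit of How We Feel’s summary view is, at its core, a frequency count over self-reported labels. The emotion names and the text-bar rendering below are illustrative assumptions, not the app’s actual taxonomy or interface.

```python
from collections import Counter

# One self-chosen label per check-in over a week (illustrative data).
week_log = ["content", "anxious", "content", "grateful",
            "anxious", "tired", "content"]

distribution = Counter(week_log)
for emotion, count in distribution.most_common():
    bar = "#" * count
    print(f"{emotion:>9} {bar} ({count})")
```

Even this trivial aggregation stays on the informative side: the user supplied the labels, and the system only mirrors their distribution back for reflection.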
How We Feel new app feature. Image provided by the author.
What could be done better
I feel that the informative model, although useful and important, is overexploited for marketing goals, while the interactive model offers a humane framework for human-computer coexistence that might be important to add to current approaches to business and metrics. So many daily interactions with design systems (especially those scaled to serve many users) can handle big data but lack human touch and compassion.
I love the idea that a design system doesn’t get to know my emotions in order to analyze me (I don’t like you, Facebook), but instead gives me a space to make sense of them. The question is: how can we sustain such design at scale, systemically?
There are promising examples of integrating empathy and meaningful interaction into a business model. For example, the Deep Viewpoints application, developed for the Irish Museum of Modern Art, provides visitors with a digital platform to share the emotions they feel when interacting with an artwork. Through mediation, users can also share reflections and questions in the form of a digital script, allowing others to access and use it for their own reflective and interpretive experiences. Such an app allows museums to better understand their communities, and gives minoritized communities a participatory space in the cultural dialogue. It lets people be active participants when they interact with cultural heritage, an important engagement practice.
Another example of a sensory experience that involves a space for reflection is a study at Blair Drummond Safari and Adventure Park. Researchers built a multi-sensory device that allowed red lemurs and visitors to interact through smell, sound, and video. The researchers found not only that people stayed longer, but also that the experience increased their empathy towards the animals and improved their educational outcomes.
“We allowed people to share in the same experience to try and get people to have a sense of understanding of the other, that we are sniffing together; it helps make animals more relatable and understandable,” said Ilyena Hirskyj-Douglas, the director of the Animal-Computer Interaction Lab, who led the project. “As people, our impact on animals and the planet is far-reaching, and I hope that this empathy can shape how people think and behave towards animal conservation. Though it is really unknown what the lemur in this case thinks. Some zookeepers think of animal-zoo visitor interaction as a type of environmental enrichment.”
Perhaps if empathy is not a resource for engagement extraction but a space to build both connection and business, our design and social systems will both benefit.

Daria Koshkina
Daria is on the Information Is Beautiful Awards 2024 Longlist. She served on the 2025 Young Ones jury and the 2024 ONE Screen jury. Her work has been showcased at the London Design Festival, The Institute for Art and Innovation, and the New Alliance Gallery. She contributed to Barabasi Lab projects shown at the Venice Architecture Biennale, the Ludwig Museum, MEET Digital Culture Center, and Postmasters Gallery. Her editorial and scientific illustrations have been published by Vice, UN Women, the Journal of Neuroscience, and Science.