An Anthropologist’s Guide to Better Automation
How anthropologists can bring value to engineering and computer science.
Posted Dec 3 2025

As anthropologists, we would like to explore how engineering and computer science might be enhanced if collaborations with anthropologists were the norm. To do this, we want to consider automation, and what anthropologists focus on when analyzing automation. Automation offers a powerful example because it may seem like an obvious, low-hanging-fruit-of-a-fix, yet it could make things worse for individuals, for families, or for communities. In some cases, automation creates even more problems for the people involved than whatever problem a computing professional might have been trying to solve.
As an example of the kind of dilemma we might be concerned about, some doctors have complained for years that the digitization and automation of medical records made their jobs much worse. Changes in documentation processes were introduced with the promise of new opportunities for efficient and insightful analysis of medical data (often for clinical or insurance purposes). We consider this an example of “automation”: the use of technology to accomplish or simplify tasks, thus changing the nature of the human role. Scaling up data capacity and requirements has meant that some doctors find they spend far too much time filling out forms and wrestling with irrelevant categorizations when they could be interacting with patients. For insurance companies, meanwhile, digitization meant that considerably more information could be collected from doctors. Doctors could not simply refuse to comply, but some have found this to be such a problem that they have started hiring medical record transcribers to fill out computerized forms while the doctors see patients. Software programs that were ostensibly geared toward assisting human labor ended up creating a whole new job role, and a fairly mind-numbing one.8 More recent innovations have introduced AI scribes (listening devices in medical examination rooms), transforming clinicians into editors who fix up the AI-generated notes, and yet the clinicians are not receiving skills training for this new job performance requirement.1
So how does one avoid introducing new problems like this? We admire the various initiatives in computer science to do genuine human-centered work, such as the ACM FAccT conference and NSF’s Human-Centered Computing program. We also note the positive influence of the Design Justice movement, which is “concerned with how the design of objects and systems influences the distribution of risks, harms, and benefits among various groups of people.”2
The CS field has been working to address shortcomings and to truly center the people who will interact with new technologies. More can be done. The stakes are rising: as artificial intelligence grows in capability and popularity, AI applications focused on efficiency and automation are being introduced in every area of human life, from education and healthcare to food, housing, courts, and local politics.
To make sure to center people in a deep and nuanced way, we suggest that collaboration with anthropologists is a good early step on the route to collaborative decision making with the people whom technology impacts, and we want to explain what anthropologists have to offer.a Although important examples of these kinds of collaborations exist, they are not the norm.
Food for Thought
Many engineers and computer scientists come to their roles wanting to make positive contributions to people’s lives. But much of the training can focus on seeing what is missing rather than also noticing what is already there. The training can tend to frame people’s experiences as problems that can be solved, rather than as experiences that are the stuff of being human.
When we approach life’s experiences with a narrow problem-solving orientation, automation can look like an attractive answer. It is astonishing how often it really is possible to build a widget, gadget, or system that makes a problem go away and creates efficiency. But when this happens, people sometimes lose forms of social interaction, or other positive experiences, that are good for them in complex ways. So, as anthropologists, we applaud the desire to make a positive impact on societies, but we also ask how the following questions can become more central to engineering practice: What positive interactions might be accidentally erased by automation? In addressing one problem or set of problems, what new problems might automation introduce?
To answer these questions, we suggest computer scientists and engineers dig deeper into people’s experiences and perspectives, to contextualize what people are doing, how, and why. If the goal is to deeply understand what matters to people, and to see people’s experiences as more than problems to be solved, you have willing partners in anthropologists.
In 1994, anthropologist Diana Forsythe, studying the “construction of knowledge in artificial intelligence,” argued that there are “epistemological disjunctions” between how technologists and social scientists see the world. Understanding and taking seriously those different worldviews (with disciplinary humility) is an important step for collaborative work, not to dismiss one way of seeing as more correct than the other, but to bring the joint power of different fields to the topics at hand.4,5,6
What would a CS/anthropology collaboration look like? An essential feature of this work is interviewing people and observing how they interact with each other and with the low-tech and high-tech objects and systems in their lives.
Consider that an errand many younger people take for granted, grocery shopping, can pose challenges for many older adults. What’s involved? Getting to the store, walking around the shop without fatiguing, remembering what to buy and where the items are, reading small-print labels, crouching low or reaching high for items, working the automatic check-out machines or figuring out the credit card reader, and carrying heavy packages. More than once, Lynch has encountered computer scientists who want to apply a technological fix to grocery shopping, such as a grocery delivery app. Lynch often finds herself diplomatically pointing out that the “fix” might remove both agency and social connections from the interaction.
Consider the example of Terri, a woman who partnered with Lynch’s engineering students. Trained to see the world as anthropologists do, the students accompanied Terri to the grocery store. They saw that Terri, an older adult and a wheelchair user, would run into friends there and interact with babies propped up in grocery cart seats. They observed how much she enjoyed choosing just the right vegetable and talking to the butcher at the counter. Terri would never describe these as close, intimate connections, but it was clear to the students, and to Terri, that these casual interactions in the store made her happier. The students ended up designing something that made it easier for Terri to carry her groceries home and, at the same time, enabled her to have more social interactions when she went to the store. The solution was a bright purple carrying rack attached to her wheelchair that invited conversation.b
Of course, there are older adults for whom grocery delivery would be a boon, people for whom the physical, cognitive, or social benefits might not matter. But in the rush to automate and scale up, so many of the systems in our everyday life have become monolithic, with no option to opt out. Scaling up requires ignoring some context and prioritizing some signals over others. It means erasing the diversity of human experience.c
Sometimes automation replaces interactions that do much more than the action being automated. In the same way that going to the grocery store is about much more than acquiring food, eating a meal can be about much more than ingesting calories. In the 1950s in the U.S., TV dinners were designed to relieve women from having to cook dinner, but they were individual trays of food, ideally eaten around the television. This innovation eliminated a shared family meal in which family members learned what was going on in each other’s days, what was challenging and what was wonderful, and what all of it meant to each other. Interactions can be complex and dense, accomplishing many social functions at once, and efficient solutions risk being, well, too efficient.d
Efficiency can inadvertently take away a sense of agency as well as opportunities for meaningful, or “messy,” social connections. By “messy” we mean the kinds of daily challenges related to waiting, fixing, misunderstanding, and misrecognizing that open our hearts and minds to each other. These are moments that are integral to how humans love, interrelate, and practice being in communities. Even “messy” social connections can be beneficial, though people aren’t always able to articulate the benefits as the mess unfolds.
Beyond Standardization and Problem Solving
We are not arguing that being instrumental is never appropriate. Rather, we invite engineers and computer scientists to look at tasks as something more than a collection of actions. Anthropologist James Wright uses the term “algorithmic care” to describe the treatment of care as a linear sequence of simple, repeatable, and discrete physical and verbal tasks that can be digitally and mechanically reproduced by robots. We would lose much of what really matters in human relationships if we saw care only as a set of standardizable tasks.10
When deciding whether to automate a process or system, we suggest that computing professionals start with people and observations before jumping to efficiency or automation.e An anthropologist can help computing professionals see beyond the instrumental task being accomplished and consider everyone affected by automating a process, not only an individual “user.” This means interviewing and observing the people who are doing the task in question. It also means interviewing and observing everyone else affected by the task, to get a sense of what matters to them and how process changes would affect them. And it means trying the task yourself before making any interventions. This approach also means being careful not to fall prey to the “streetlight effect,” where you look for your lost keys only in the place where you can see. If you go into a situation assuming there is efficiency and automation to be had, you may miss other important opportunities to enhance what is already beautiful, wonderful, and meaningful. There is value in the kind of ethnographic curiosity and openness with which an anthropologist approaches people.
For a moment, let’s consider meal preparation. If you notice that weeknight food preparation can be stressful for busy families, you might decide to automate making dinner to ease the burden on parents. We suggest that before jumping to automation, you watch and interview the parents, and the children, too. Maybe cooking means that the parent is always calling their own parents for tips and assistance, and so you might want to interview the generation above to see how automating meals would affect the grandparents: Might it make them feel less valued?
A lot of engineers do this already. In good human-centered design, engineers consider people and process in the whole system. Design Thinking was an early attempt to bring these practitioners and practices together. Done well, Design Thinking recognizes all the design principles at play and solves for their intersection. We would like to see more and deeper integration like that.
Sometimes, the engineering approach can be too narrowly problem-focused, rather than looking for what matters most to people, what they value, what brings them joy, and how “messiness” brings meaning to their lives. Merely asking “What is bothering you?” and “What are you hoping for?” does not capture the more ephemeral meaning-making taking place. You might talk to somebody about how difficult it is for them to find time to make meals on a Monday night. They could say something like, “Yeah, I just need the food,” and you decide to create a meal kit system. They themselves might not notice what then happens: how a regular meal kit means they talk less to their parents, or stop being creative by figuring out what to cook with whatever happens to be in the refrigerator, or how their kid had been taking notice of their skills and learning something, too. They may not be fully able to notice or articulate the meaning they gain through cooking. People cannot necessarily articulate what is important to them.

This is why anthropologists do more than ask people what they do and what they need. We spend time with people, and we conduct “participant observation,” because we, including the people in question, do not always know what automation will erase. And we partner with community members: they are the experts in this engagement. An anthropologist in this scenario might come to understand that the problem is not cooking but making certain kinds of decisions, and that the “solution” might be a way to generate recipes from what is already in the refrigerator. Or maybe the anthropologist, through participant observation (preparing meals together, eating together), notices that parents want to cook with their children, but the kids get frustrated that everything is too hard for them to do. In that case, maybe the “solution” has to do with offering recipes that are age-appropriate for certain kinds of helpers.
In industry and in academia, there are strong, positive examples of collaborations among computer scientists, engineers, anthropologists, and other social scientists (including the growing acceptance of ethnographic approaches in industry, such as in user-experience design).3,7,9 But they are not the norm, and they are neither widely known nor widely taught in CS and engineering education.f Moreover, the structure of academic departments, incentive systems, and professional success metrics can be a barrier to doing collaborative work. Recently, an engineering professor who works in assistive tech told Lynch, “Before meeting you, it never would have occurred to me to work with a social scientist.” This came up in a discussion about tech that ends up hidden away in storage closets due to a mismatch between the tech developer’s specs and the recipients’ values.
Final Considerations
What can computing professionals ask, observe, and consider before spending too much time creating a wonderful automated system that accidentally removes agency, meaning, and connection from people’s lives? We suggest two areas to frame your work:
Consider new limitations and new erasures. What is being erased by automation? Look for social interactions, opportunities for mastery and control, chances to learn and grow.
Consider new affordances and new introductions. What is being introduced by automation? Is it all positive, and for whom? There may be new interactions, new opportunities for mastery and control, and new chances to learn and grow. But automation may also introduce negative interactions, erase opportunities for mastery and control, and create new processes of maintenance and repair that are frustrating, mind-numbing, or burdensome in some other way.
Finally, we suggest that you pursue these inquiries in collaboration with colleagues and with the people whose lives you are trying to positively impact. To automate or not? The decisions are important for preserving and generating human meaning, in all its messiness. Decisions like these should be communal ones, and, as the Design Justice movement shows, it is imperative to include voices and perspectives of the people being impacted. Often other people will understand aspects of an interaction that you have overlooked, and you will see aspects they have overlooked. We urge you to have these conversations with people not only with different disciplinary backgrounds, but also with people from many different walks of life. Please don’t go it alone.
© 2026 Copyright held by the owner/author(s).