A robot is chatting with an elderly British man in his bedroom. The robot has a smooth exterior and a pleasant, high-pitched voice.
The robot – perhaps because of the man’s age – begins to ask him about his memories of the second world war: “Tell me, please, what was the most difficult thing for you and your family to do?” The old man goes on to talk about how his father served in the Royal Air Force, and how the family hadn’t seen him for almost four years.
But why was a robot asking him so bluntly about what may have been one of the most traumatic experiences of his life? The robot’s behavior was a product of the CARESSES project (Culture-Aware Robots and Environmental Sensor Systems for Elderly Support).
The project is part of the emerging field of “cultural robotics”, which aims to design robots that can take into account the cultural background of the person they are talking to and adjust their behavior accordingly. That’s why the robot brought up the war: the man was British, so it assumed he would be interested.
In the future, we can expect robots to be deployed more and more in our personal and social lives. Active research is underway in areas as diverse as delivery robots for supermarkets, entertainment robots, service robots for healthcare, warehouse robots, robots for dementia support, robots for people on the autism spectrum and care robots for older people.
There are even robot priests that can deliver blessings in five languages, and robot monks that can teach people about Buddhism.
Cultural stereotypes
Cultural robotics is part of a broader movement to make AI and robotics more culturally inclusive.
Concerns have been raised about this movement before. For example, large language models (LLMs), such as the ones behind OpenAI’s ChatGPT, are trained on vast amounts of text. But because the internet is still predominantly in English, LLMs are trained mainly on English text – with all its cultural assumptions and biases.
Similarly, the move to make robots and AI more culturally sensitive is well-meaning, but we worry about where it might lead.
For example, one study compared cultural preferences in China, Germany and South Korea to draw conclusions about how people in those countries would like their robots to look.
Drawing on previous work on cultural preferences, the authors suggested that more “masculine” societies perceive “big and fast” things as beautiful, while more “feminine” societies perceive “small and slow” things as beautiful. They cited work claiming to show that Korean culture is of “medium masculinity” and German culture of “high masculinity”, and hypothesized that Koreans would therefore be more likely to find service robots (which tend to be small or medium-sized, and slow) appealing.
Another study compared the personal space preferences of “Germans” and “Arabs”. But these categories are not comparable. “Arab” is a term many people consider derogatory, and it can be used to describe people from a wide range of cultural and national backgrounds. It is certainly not on a par with a category such as “German”, which refers to people of a single nationality.
It is also becoming increasingly clear that people respond differently to robots depending on their own cultural background. For example, different cultures have different expectations of personal space, and this affects how far away people prefer robots to stand.
Different cultures also interpret facial expressions differently. One study found that people are better able to understand a robot if it communicates using facial expressions that are familiar to them.
Another way?
If we want to avoid designing robots based on broad, crude generalizations and stereotypes, we need a more nuanced approach to culture in robotics.
Culture is a vague and nuanced concept, open to many interpretations. One survey lists more than 300 possible definitions of culture.
In our recent research, we argued that culture is “conceptually fragmented”. In short, our view is that there are so many different ways of understanding culture, and so many different types of robots, that we shouldn’t expect a one-size-fits-all approach.
We think that different applications within robotics will require radically different approaches to culture. Imagine, for example, an entertainment robot in a theater whose job is to get the audience dancing.
For this role, perhaps the best approach to culture is to focus on the kinds of entertainment people in the local area prefer. This could involve asking which dance styles are popular locally and designing the robot’s behavior around them.
Other applications may require a different approach to culture. For example, for a robot expected to interact with the same small group of people over a long period of time (such as a service robot in a care home), it may be more important for the robot to change its behavior over time, adapting to the changing preferences of the people it is helping.
In this case, it might be better to think of culture as something that emerges slowly and dynamically through the interactions of the individuals involved.
All of this means that the approach to culture in robotics is likely to be complex, multifaceted and situation-specific.
If we design robots based on relatively crude stereotypes and hasty generalizations about different cultures, we risk propagating those stereotypes.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointments.