

Eduard Fosch Villaronga: 'Robots are mainly for the average person'

IT lawyer Eduard Fosch Villaronga wants to promote diversity and inclusiveness in AI research. And that's sorely needed: he has observed how artificial intelligence - from Twitter to walking robots - is prejudiced in terms of race, gender and sexual orientation.

‘Two international treaties on human rights contain specific provisions regarding harmful and abusive stereotyping,’ says Fosch Villaronga. ‘These are the treaties on discrimination against women and people with disabilities. These obligations apply not only to states, but also to businesses and industries, including artificial intelligence.’ Fosch Villaronga investigates whether AI actually complies with human rights.

eLaw researcher Eduard Fosch Villaronga teaches students, scientists and policymakers to take a critical look at inclusiveness within AI.

As it turns out, it doesn’t always: the researcher has observed that artificial intelligence replicates prejudices that are prevalent in society with regard to race, gender and sexual orientation. ‘Twitter, for example, thinks I’m a woman,’ he discovered. To show targeted ads, Twitter analyses your tweets to guess whether you’re a man or a woman.* ‘When I discovered that, I asked 109 people to check what gender Twitter thinks they are. For heterosexual men, Twitter is wrong 8% of the time; for heterosexual women it is wrong in 16% of cases, and for homosexual men in as many as 25%.’

'People who were too heavy were excluded'

Stereotyping by AI can sometimes have serious consequences. Fosch Villaronga was involved in a project on exoskeletons that help users walk again after becoming paralysed. ‘I was in the working group that looked at the ethical, legal and social impact. The aspect I was specifically interested in was whether these robots were available to anyone who needed one. And, unfortunately, that wasn’t the case. They weren’t made for children, and people who were too heavy were also excluded. It was as if the exoskeletons were only made for the average person.’

Greater focus on dignity

The researcher has worked in many European countries and is fluent in five languages. He advises the European Commission on consumer safety with regard to AI products. ‘We are revising the General Product Safety Directive, which is the main piece of product safety legislation in Europe. In most cases it concerns physical safety, but I also draw attention to psychological safety, dignity, autonomy and long-term consequences. For example, is it safe for this autism robot to talk to children?’

Fosch Villaronga, who has also written a book on robots, healthcare and law, says: ‘I would like researchers and policymakers to seriously think about the type of AI that we actually need. For example, an awful lot of money is being invested in robots that can talk to patients, but nurses tell me that a robot that makes beds would be much more useful. Then they would have more time to talk to the patients themselves.’

* Find out which gender Twitter thinks you are: Log in and go to Settings and Privacy -> Account -> Your Twitter Data.  

Text: Rianne Lindhout
Photo: Patricia Nauta
