Universiteit Leiden


Artificial intelligence not geared towards our diverse society is dangerous

Women, the elderly, LGBT people and children are all at risk because artificial intelligence, algorithms and exoskeletons are tailored to the straight white male. Research conducted by Leiden University aims to ensure that new developments work for everyone.

In an extensive interview in the Dutch regional newspaper Leidsch Dagblad, Eduard Fosch Villaronga explains that our current robotics and artificial intelligence systems are inadequately geared towards diverse Dutch society. 'Not learning how to correctly identify and incorporate gender, age and other characteristics in our research and developments will have far-reaching consequences.' He refers to the recent childcare benefits scandal in the Netherlands as an example. Parents were blacklisted by discriminatory algorithms and falsely labelled as fraudsters. Such faulty products are eventually taken off the market, 'but ideally they should only come onto the market once they've been improved. Otherwise, it could cause unimaginable suffering,' says Villaronga.

In the article, Villaronga discusses the motivation behind his research and explores the problems of misgendering and discrimination in algorithms in more detail. Read the full interview (in Dutch) in the Leidsch Dagblad (€).

Photo by Aideal Hwa on Unsplash
