
Tools for public authorities to be more transparent about algorithmic profiling
Public authorities fail to inform citizens about the use of algorithmic profiling in administrative decisions, or do so too little or too late. This is clear from research conducted by Anne Meuwese and Fatma Çapkurt on the legally and practically responsible use of profiling algorithms.
The study was prompted by situations such as the one at the Dutch Education Executive Agency (DUO), where indirect discrimination occurred during checks on grants for students living away from home. It aims to prevent these kinds of abuses by enabling public authorities to recognise, apply and explain algorithmic profiling. Three concrete ‘products’ have been developed to help public authorities deal with algorithmic profiling in a more transparent, cautious, and legally responsible way.
First, an academic article (in Dutch) was published in the Nederlands Juristenblad, providing an in-depth analysis of how Article 22 of the General Data Protection Regulation should be implemented. Among other things, the article discusses the Schufa case that came before the Court of Justice of the European Union. It is critical of current Dutch government policy, which fails to properly regulate profiling practices. In response to the publication, parliamentary questions have been put to three members of the government.
Second, a practical Roadmap (in Dutch) has been developed, which shows in a visual and accessible way how government organisations can use algorithms within the framework of the GDPR. The Roadmap sets out clear steps and points of attention for various roles within government, from legal experts to communication advisors, and underlines that providing clear information about algorithms is also a legal obligation.
Third, a public document was drawn up to inform citizens about their rights in relation to profiling. Using simple language, it explains how they can request access to their data and which signals could indicate the use of algorithms in decision-making. This not only strengthens their legal position, but also encourages governments to be accountable for their digital processes.
More information on the research project is available here (in Dutch).