Reijer Passchier’s AI research cited in Follow the Money article
Government and public bodies in the Netherlands increasingly make use of complex data collected on citizens. But the connections between all this data are opaque, and the algorithms government services use to process the data are difficult to verify. This is a recipe for persistent system errors with major consequences, as the recent childcare benefits scandal showed. The Dutch House of Representatives has now passed a new law that allows even more far-reaching data connections.
Advanced data systems are necessary: a government with a strong digital foundation is essential for running society today. Innovative technologies keep government authorities connected to the world around them and enable efficient decision-making, better accessibility and uniformity in services.
But there is also a downside, says Reijer Passchier, lecturer in constitutional law at Leiden University and the Open University, in an article by Sebastiaan Brommersma, research journalist at Follow the Money. Passchier recently published a book on the impact of digitalisation on the rule of law. Passchier: ‘Digitalisation has made the government nontransparent. Citizens, courts and parliament often don’t understand how government systems work. The result is that the executive branch in particular, which uses the most technology by far, can no longer be properly monitored. The rule of law system is therefore out of balance.’
Not being able to monitor the administration has already led to serious violations of civil rights, according to Passchier. He points to the notorious anti-fraud detection system known in Dutch as Systeem Risico Indicatie (SyRI). This system linked data from different public authorities and then used algorithms to search, specifically in disadvantaged neighbourhoods, for ‘increased risks’ of social services being misused. Those who ‘surfaced’ in the system were then subjected to extra checks. SyRI was prohibited by a court ruling in early 2020: the court found that the system was in breach of the European Convention on Human Rights and was insufficiently transparent and verifiable. The people affected did not know exactly what data SyRI had processed, or how or why the system did this, and they were not informed that the system had flagged them as a ‘risk’.