
‘Privacy is shifting from Big Brother to Kafka’

On the Day of Privacy, 28 January, the European Commission is calling on citizens to make sure they protect their personal data. But how do you do that, and against what, exactly? Privacy researcher Bart Custers explains.

Every year on 28 January the European Commission 'celebrates' the Day of Privacy. It was on this day in 1981 that the European Data Protection Convention was signed. 'The Convention was in fact the first set of regulations for protecting personal data,' Custers explains. He conducts research on privacy, including online privacy, within the eLaw group at the Faculty of Law. 'The Convention was drawn up before the internet age. All kinds of databases were due to be set up and the Council of Europe decided that rules were needed on how to handle all that information.'

Rules for personal data

'These are very basic rules,' Custers says. 'You have to let a person know in advance what their data will be used for, for example, and obviously you can't change that purpose afterwards, or use the data for something else.' Another point is that you are not allowed to store more data than is needed for the purpose, you have to make sure the data are well protected and you have to be transparent about how you handle them. 'A European directive was drawn up in 1995 based on these rules, and our Dutch legislation is in turn based on that directive: first the Personal Data Protection Act, and since May last year the new General Data Protection Regulation, known in the Netherlands as the AVG.'

Nothing new in the AVG

If these rules sound familiar, there's good reason for that. 'It's quite remarkable that that very first convention is not so very different from our new data protection rules. The difference is that the AVG stipulates some heavy fines for those who don't adhere to the rules.' These fines are substantial: a company can be fined 2 to 4% of its worldwide annual turnover. 'Those sums are big enough to make even big multinationals like Google and Amazon sit up and listen.' And that's a good thing, says Custers. 'Our society and our companies have developed enormously since 1981, and it's become much easier for them to share data. It's all too easy to break the rule about not using data for a different purpose.'
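To give a sense of scale: for a hypothetical multinational with a worldwide annual turnover of 10 billion euros, a fine of 2 to 4% would come to 200 to 400 million euros.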

Predicting sensitive information

The enormous amounts of data that we deal with today and the increase in computing power mean that it is possible to make predictions. We refer to this as data science. 'Facebook and Google are already doing that now: based on their data they can predict what kind of person you are and what you like. But it goes further,' Custers says, and that's where the threat to the public comes in. 'With all that data it's also possible to predict sensitive information, such as your religion, sexual preference and even your health. And then it really matters who gets hold of this information. If a doctor finds you have particular health risks, it may be an advantage for you to know about them, but on the other hand you may not want that information to get into the hands of your insurer.'

From Big Brother to Kafka

Custers sees a further threat to our privacy in the fact that decisions are increasingly being made by algorithms, with the human link missing entirely from the process. That also means an important safety valve has disappeared: if an algorithm takes a wrong decision on the basis of errors or some problem with the data, that can't simply be corrected. 'Take a speeding fine, for example. That's completely automated. But what happens if your number plate is read wrongly? You have to lodge an appeal and a human being has to be able to put the mistake right. Otherwise you'll get a situation like "The Trial", Kafka's famous novel in which a man is wrongly accused of a crime and convicted, even though he knows nothing about it.' According to Custers, the nature of privacy has changed in recent years. 'From Big Brother, who watches everything you do, to Kafka: you're faced with a decision on which you have had absolutely no influence.'

Text: Marieke Epping

How can you protect your personal data online?

The dangers of all these online data are clear. But what can we do to secure our personal data? Custers: 'It seems logical, but think carefully about what you share and with whom. And bear in mind that there are always other people looking at your information.' Take your Gmail account, for example: you mail friends about your next holiday and suddenly you're receiving adverts for airline tickets. And WhatsApp too: the messages you send are encrypted, but WhatsApp itself has the key. 'They state that they don't listen in, but in theory they're perfectly able to do so.' One option is to choose paid services, Custers says. 'They're based on a different business model, and they're a bit more cautious about how they handle your data.'

Custers' research also aims to contribute to safeguarding privacy. 'In the legal field, we explore whether privacy can be maintained by adding extra safeguards and clauses to the AVG. And in the technical area we are looking at privacy-by-design: developing search strategies that can roam around big data but with certain restrictions, so that they can't trace patterns based on ethnicity, for example. Non-discriminatory algorithms, in other words.' 
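The idea of such restricted analysis can be made concrete with a small sketch. The example below is not Custers' own method or code from the eLaw project; it is a minimal illustration, assuming a tabular dataset with invented column names and an invented correlation threshold, of how a pipeline could drop a protected attribute such as ethnicity, together with any numeric column that mirrors it too closely, before patterns are mined.

```python
# Illustrative sketch only: not Custers' actual method. It shows one crude
# "privacy by design" restriction in which protected attributes (and numeric
# columns that track them too closely) are removed before pattern mining.
# The column names and the 0.8 threshold are invented for this example.
import pandas as pd

PROTECTED = ["ethnicity"]       # attributes the analysis must never use
PROXY_CORRELATION_LIMIT = 0.8   # drop columns that mirror a protected attribute

def strip_protected(df: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of df without protected columns or strong numeric proxies."""
    allowed = df.drop(columns=[c for c in PROTECTED if c in df.columns])
    for protected in PROTECTED:
        if protected not in df.columns:
            continue
        # Encode the protected attribute as numbers so correlation can be measured.
        protected_codes = df[protected].astype("category").cat.codes
        for col in list(allowed.columns):
            if not pd.api.types.is_numeric_dtype(allowed[col]):
                continue
            corr = allowed[col].corr(protected_codes)
            if pd.notna(corr) and abs(corr) > PROXY_CORRELATION_LIMIT:
                allowed = allowed.drop(columns=[col])
    return allowed

if __name__ == "__main__":
    data = pd.DataFrame({
        "ethnicity": ["a", "b", "a", "b"],
        "postcode_segment": [1, 2, 1, 2],   # hypothetical proxy for ethnicity
        "income": [30, 45, 50, 28],
    })
    # Only columns that neither are nor mirror a protected attribute remain.
    print(strip_protected(data).columns.tolist())  # -> ['income']
```

Real non-discriminatory and privacy-by-design techniques go much further than this, but the principle is the same: the restriction is built into the tooling itself rather than left to the discretion of the analyst.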
