
Social Science Matters: The surveillance society

Those who know their dystopian classics will inevitably associate the concept of the surveillance society with the all-knowing, oppressive force characterized as Big Brother in George Orwell’s novel 1984. However, surveillance permeates our society in many subtler ways than our worst fears about spycraft and all-seeing despots or governments. Who determines where the balance between privacy and control lies? How, as a society, do we make sure these powers are not abused? Our social scientists give their points of view.

Technology and privacy: trust or distrust

- Elise Swart, Education and Child Studies

The increasing possibilities and accessibility of technological applications and digital media have given a new dimension to upbringing and education. But what if technology is used as a way to have (a sense of) control over the physical world?

Many smartphones have apps to track the location of others in real time, there are special smartwatches for children with a built-in GPS function, and there are even GPS trackers that you can attach to your child's school bag. Recent research shows that, in the Netherlands, 1 in 5 parents track the location of their child and, additionally, 1 in 4 parents are considering starting to use these types of tools. Online systems in which parents can monitor their children's school performance also increasingly provide information about the child's presence and absence. Although parents indicate that they use location tracking to increase children's freedom of movement or for a sense of security, scientists also point to potential drawbacks of these surveillance methods (apart from the technical privacy-protection problems, as demonstrated by the Norwegian Consumer Association). Empirical research into the potential downsides of these new technologies is still in its infancy, but scientists point out that, from a pedagogical, moral and ethical perspective, continuous or frequent monitoring of a child's location can violate the child’s right to privacy and affect the development of trust between parent and child, the child’s sense of responsibility and autonomy, and the opportunity to learn how to deal with risks.

As with all new (digital) possibilities in upbringing and education, we as parents, teachers and scientists should regularly ask ourselves: does the fact that something is possible also mean that it is desirable?

The call is coming from inside the house!

- John Boy, Cultural Anthropology and Development Sociology

Screening Surveillance is a series of short films produced by sava saheli singh. During an event featuring singh hosted by our research cluster, we watched two of the films, which speculate on a near future in which “smart cities” and fully integrated health data systems envelop daily routines. The subsequent discussion, however, focused on issues that were even closer to home. Our Faculty had recently adopted the automated proctoring system Proctorio, which most participants in our event perceived not only as an unacceptable imposition on our students, but also as yet another unfortunate step by our own institution toward complicity with surveillance capitalism. Proctorio turns our students’ activities into data to be monetized, and such datafication enables a process of “accumulation by surveillance”. Online proctoring alone is projected to become a 20 billion euro industry by next year, and the wider ed-tech sector is valued at nearly 100 billion euros.

That would be fine if this industry served our institution’s core business of teaching and learning. The case of Proctorio at least suggests otherwise. The company has repeatedly sought to silence academics critical of its product and of algorithmic proctoring in general, and has even tried to pressure a peer-reviewed journal, Hybrid Pedagogy, into retracting an article critical of proctoring software. All the while, evidence that Proctorio and related systems are ableist and discriminatory has been mounting. I was heartened to see that students around the world, including students in our own Faculty, refused proctoring.

As we work out what the post-pandemic university will look like, our institutions’ relationship with these kinds of predatory systems must be a major issue. Will we be able to slough them off, or will they be normalized the way Turnitin has become normalized? Recent experiences incline me toward pessimism. I am advising a group of students doing research on StudyStream, a voluntary online learning space begun during the pandemic. Popularized by a series of TikTok videos, the core experience of StudyStream consists of Zoom rooms and a Discord server where students from around the world get together to subject themselves to voluntary mutual surveillance. This is not a niche phenomenon; hundreds of thousands of students sign on to take part. While StudyStream is not currently monetizing this experience and may well find a business model that does not require it to exploit user data, I am struck by how quickly intense surveillance has become a taken-for-granted part of students’ learning environments. As educators, our responsibility to defend education as “the practice of freedom” has a newfound urgency at this time.

#CameraGate or: Why we should be wary of allowing invasive technology in public spaces

- Hilde van Meegdenburg, Political Science

On November 17, 2021, Mare published an article on the installation of smart cameras throughout the university grounds: “Suddenly there are smart cameras everywhere”. These cameras harbour great potential: they can accurately register room occupancy, expose frequented routes, and track movements. They can also register the sex and height of passers-by. And the producer boasts that their “sensors are equipped with various artificial intelligence extensions that continuously evolve”. The university prefers to call them “classroom scanners”—a much more innocent term—and stresses that none of the more privacy-invading features will be used, not now and not in the future. Then why am I concerned? For two closely related reasons.

Function creep - Promises and intentions, unfortunately, offer no guarantees for the future. To be sure, I do believe the Board when they say they have no interest “whatsoever” in gathering additional data, but interests change, and the current University Board cannot foreclose the decisions of the many boards that will follow. As Marc Schuilenburg (chair in Digital Surveillance at Erasmus University) was quoted as saying in the Volkskrant: ten years on, surveillance systems are never used in line with the original intentions. With the cameras pre-installed, and the lines to the supplier—who will gladly take the extra assignment—short, adding more invasive features and analytics is always only a mouse-click away. And with the capacity at the ready, add just a whiff of sunk-cost fallacy and the decision to lower the privacy settings—‘only for this short moment and only for as long as the situation lasts’—is likely to prove tempting. And emergency powers tend to ratchet.

Technology can stifle - Even if certain options are not turned on at the moment, the potential inheres in the hardware, and that potential can have a repressive effect. At all times and across the globe, student movements and rebellions have proven to be critical voices against war, oppression, racism, sexism, and bigotry in general. Universities often are, and always should be, spaces where nascent movements can assemble and raise their voices. Knowing that every hallway and every classroom is equipped with a little double-eyed spy that can see how long you lingered and looked at a certain poster does not foster freedom and trust. And what happens when facial recognition software does become available for these specific cameras? What happens when a little program is written that allows them to register not only sex but also skin colour? Whether we have something to hide depends not only on our own behaviour but also on the behaviour condoned by those in power. And, to return to the first point, interests, but also ideas about appropriate behaviour, change. For better or for worse, contemporary promises and intentions offer no guarantees for the future.

If the above scenarios seem unlikely to you, still consider this last point: why buy technology that you do not intend to use, not now and not ever? Considering the guarantees that are necessary because of the inherent capabilities, the potential, of the system, why not obtain simpler means? The General Data Protection Regulation (GDPR) suggests that less privacy-invasive means should be preferred. Privacy-invading technology should only be used when necessary—when there are no other means to obtain the end—and only when the functional gains are proportional to the violation.

In the end, the best guarantee against contemporary and future privacy violations is not to install hardware that makes such violations possible: to make an intentional choice for privacy by design. My suggestion is that the university make that choice: despite the initial costs, and despite the fact that other means of measuring room occupancy may be more cumbersome or less accurate, to replace the cameras with technology that, by design, cannot stifle and repress, and that cannot do more than the simple job it is intended to do.

Come on, we don’t like our privacy that much…

- Roy de Kleijn, Psychology

We’re all very concerned about our privacy. So much so that we are willing to delete our Facebook account (but not WhatsApp or Instagram!) and switch to DuckDuckGo or other privacy-minded search engines (but we still use Google if the need arises).

While many like the idea of regulators protecting our privacy, who actually likes (let alone reads) the EU-mandated cookie warnings that appear whenever you visit a new website, and who reads through the terms and conditions when signing up for something? In other words, it’s about finding a balance between privacy and convenience.

Of course, some concerned people will tell you that privacy is a basic human right and should be protected at all costs. But research suggests that people who express concern often do not act accordingly and are in fact willing to reveal personal information for relatively small rewards, a phenomenon known as the privacy paradox.

For example, Facebook makes its money by selling targeted advertising based on the collected data of its users, earning around $9 per month per user in the process. Are users willing to pay $9 per month to disincentivize Facebook from collecting their data? You guessed it… only about 8% of users are willing to do so.

It seems to me that most people are not that concerned about privacy, at least not enough to actively protect it or be inconvenienced by it.

Social Science Matters – a soapbox for social scientists

Social Science Matters is an online variant of London’s famous Speakers’ Corner – a platform for the researchers in the various disciplines in the Faculty of Social and Behavioural Sciences to react to the news. This soapbox gives the social scientists of the faculty the opportunity to voice their opinions on current affairs from the point of view of their own areas of expertise.
