

Fall of Misinformation Series: Suzan Verberne

Misinformation spreads easily and fast. It gets presented as news, whereas actual news gets dismissed as fake. Conflicting streams of information allow all sides to cherry-pick whatever is most comfortable, boosting degrees of confidence and confusing the deliberation of both politicians and voters. Conspiracy theories have growing numbers of adamant followers who see misinformation where others see truth. From COVID-19 to QAnon, misinformation is on all our minds. What exactly is happening and why? Have we entered a post-truth era? What can we do as a university, and are we doing enough? The Young Academy Leiden approached some of the researchers currently working on the topic.

There is a widespread worry about misinformation and fake news, and many point to social media platforms as at least partly to blame. How do you understand the phenomenon, and how do you see the role of social media here?

Misinformation is a challenging problem, with more aspects than just the dichotomy between truth and falsity. Information can be factually true but misleading. An example is the message “50% of the people with covid in the hospital are vaccinated!! What does that tell you???!!”. The statistic of 50% could be correct, but the implication made by the author (that vaccines don’t work) is incorrect. Information is expressed in language, and social media language is often ambiguous, subjective, and emotional. Social media companies promote free speech, want to connect people, and from a commercial perspective they want to increase engagement on their platform: the longer people keep scrolling through their news feed, the more advertisements they see. For that reason, the algorithms that determine the content of your news feed have been optimized for engagement: the more something is liked, shared, and commented on, the more likely it is that you see it in your feed, independently of the quality of the shared information.
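
The engagement-driven ranking described here can be illustrated with a toy sketch. The weights and post fields below are hypothetical; real feed-ranking systems are vastly more complex, but the key property is the same: nothing in the score measures the quality or truthfulness of a post.

```python
def engagement_score(post, w_like=1.0, w_comment=5.0, w_share=10.0):
    """Toy engagement score with made-up weights: comments and shares
    count more than likes. Note that truthfulness plays no role."""
    return (w_like * post["likes"]
            + w_comment * post["comments"]
            + w_share * post["shares"])

def rank_feed(posts):
    """Order a feed purely by engagement, ignoring information quality."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "factual",  "likes": 120, "comments": 4,  "shares": 2},
    {"id": "outrage",  "likes": 60,  "comments": 40, "shares": 25},
]
print([p["id"] for p in rank_feed(posts)])  # → ['outrage', 'factual']
```

The post that provokes the most comments and shares ends up on top, even though it has fewer likes; this is the mechanism by which engagement optimization can amplify toxic or misleading content.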

The Wall Street Journal’s investigative series “The Facebook Files” describes clearly how an algorithm that was intended to foster more meaningful social interaction was in fact pushing misinformation and toxic content into people’s news feeds [1].

 

Can you tell us something about your own work on this topic? I understand that you have studied false news dissemination? 

We have studied the spread of so-called junk news on Facebook [2]. Junk news is low-quality content from click-bait-type websites. We compared the spread of junk news by Dutch Facebook users to that of mainstream news from outlets such as AD, RTL, and nu.nl. We found that during the period 2013–2017 the total number of user interactions with junk news significantly exceeded that with mainstream news. Over 5 million of the 10 million Dutch Facebook users had interacted with a junk news post at least once. This is consistent with research on Twitter showing that false news spreads faster than true news [3].

 

Does your research underwrite any steps that could be taken to curb the spread of misinformation? Do you think that automated text analysis can help? 

I supervise a number of student projects addressing misinformation on social media. We see that automatically classifying news as true, false, or misleading is difficult. One example was a student project about the spread of false news during the Indonesian presidential election [4]. The student found that distinguishing truths from falsehoods using machine learning models that look at the textual content of tweets is possible with up to 80% precision, but misleading news is much more difficult to recognize. And when our model has to filter false news from a stream of data in which neutral or unrelated messages also occur, it becomes much harder still.
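
To make the approach concrete: text-based classification of this kind learns which words are characteristic of each class from labeled examples. The sketch below is a minimal Naive Bayes classifier on toy data, not the models or data from the student project [4]; all example messages and labels here are invented for illustration.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Train a multinomial Naive Bayes model on (text, label) pairs."""
    label_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in docs:
        for w in text.lower().split():
            word_counts[label][w] += 1
            vocab.add(w)
    return label_counts, word_counts, vocab

def predict_nb(model, text):
    """Return the most probable label for a text under the trained model."""
    label_counts, word_counts, vocab = model
    total_docs = sum(label_counts.values())
    best, best_lp = None, float("-inf")
    for label in label_counts:
        lp = math.log(label_counts[label] / total_docs)  # class prior
        n = sum(word_counts[label].values())
        for w in text.lower().split():
            # Laplace smoothing so unseen words do not zero out the score
            lp += math.log((word_counts[label][w] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

train = [
    ("shocking secret cure they hide", "false"),
    ("miracle cure doctors hate", "false"),
    ("ministry publishes vaccination statistics", "true"),
    ("hospital reports admission figures", "true"),
]
model = train_nb(train)
print(predict_nb(model, "secret miracle cure"))  # → false
```

A word-based model like this can separate clear-cut cases, but it also shows why misleading-but-true messages are hard: their vocabulary often looks exactly like that of factual news.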

 

 Do you have plans for further work on the topic? 

Yes. I would like to address the relation between news coverage on a topic and social media utterances about the same topic. What is the effect of viewpoints, emotions, background knowledge, and the sources that someone read before they posted their message? I want to move away from the true/false dichotomy and investigate how news spreads, and what the roles of viewpoints and emotions are in the spreading of news on social media.

 

Last year you organized MISDOOM, a multi-disciplinary conference on misinformation. What is your view of the current state of research on the topic? How interdisciplinary are the current approaches? 

For the recognition and filtering of misinformation it is in my view necessary to combine automated text analysis with network analysis (who is connected to whom, and what are the clusters in which information is shared), and to combine computerized analysis with manual analysis by experts. Fully automated filtering is not realistic, but automated text and network analysis can definitely assist human fact checkers in doing their job.
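
The network side of this can be sketched with a minimal example: build a graph from who-shares-with-whom interactions and find the clusters in which information circulates. The sketch below uses connected components as a stand-in for the community-detection methods a real analysis would use, with invented user names.

```python
from collections import defaultdict

def sharing_clusters(edges):
    """Find clusters of users in an undirected share graph via
    connected components (a simple proxy for community detection)."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, clusters = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:  # depth-first traversal of one component
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(graph[n] - comp)
        seen |= comp
        clusters.append(comp)
    return clusters

# Hypothetical share interactions: ann↔bob↔cem form one cluster, dia↔eli another
edges = [("ann", "bob"), ("bob", "cem"), ("dia", "eli")]
print(sorted(len(c) for c in sharing_clusters(edges)))  # → [2, 3]
```

Once such clusters are identified, the text analysis from the previous answer can be applied per cluster, so that human fact checkers can prioritize the communities in which suspect content circulates most.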

Any final comments?

I think it is worrying that the COVID-19 pandemic has led to more distrust in government, science, and media. It is very challenging that this distrust has been nurtured by conspiracy theories and disinformation [5]. It is almost impossible for the government to reach and connect to people who do not trust the government or the media. I think this human aspect is the most important concern about the spread of misinformation and disinformation.

  

[1]  www.wsj.com/articles/the-facebook-files-11631713039 (you can find the podcast on Spotify)

[2]  Peter Burger, Soeradj Kanhai, Alexander Pleijter, Suzan Verberne (2019). The reach of commercially motivated junk news on Facebook. PLoS ONE 14(8): e0220446.  doi.org/10.1371/journal.pone.0220446

[3]  Soroush Vosoughi, Deb Roy, Sinan Aral (2018). The spread of true and false news online. Science 359(6380): 1146–1151.

[4]  Rayan Suryadikara, Suzan Verberne, Frank W. Takes (2020). False News Classification and Dissemination: The Case of the 2019 Indonesian Presidential Election. In: Proceedings of the CIKM 2020 Workshops, co-located with the 29th ACM International Conference on Information and Knowledge Management (CIKM 2020), CEUR-WS, Vol. 2699.

[5]  Dreigingsbeeld NCTV: Aanslag Nederland voorstelbaar, dreiging vooral van eenlingen [NCTV Threat Assessment: attack in the Netherlands conceivable, threat mainly from lone actors], NCTV news release, 15-10-2020. www.nctv.nl/onderwerpen/dtn/nieuws/2020/10/15/dreigingsbeeld-nctv-aanslag-nederland-voorstelbaar-dreiging-vooral-van-eenlingen
