
SAILS Lunch Time Seminar

Monday 17 January 2022
Online only

Rineke Verbrugge - Computational agents can help people improve their theory of mind

Rineke Verbrugge, Professor of Logic and Cognition at the University of Groningen

Abstract: When engaging in social interaction, people rely on their ability to reason about other people’s mental states, including goals, intentions, and beliefs. This theory of mind ability allows them to more easily understand, predict, and even manipulate the behavior of others. People can also use their theory of mind to reason about the theory of mind of others, which allows them to understand sentences like “Alice believes that Bob does not know that she wrote a novel under a pseudonym”. But while the usefulness of higher orders of theory of mind is apparent in many social interactions, empirical evidence has so far suggested that people often do not use this ability spontaneously when playing games, even when doing so would be highly beneficial.

In this lecture, we discuss some experiments in which we have attempted to encourage participants to engage in higher-order theory of mind reasoning by letting them play games against computational agents: the one-shot competitive Mod game; the turn-taking game Marble Drop; and the negotiation game Colored Trails. It turns out that we can entice people to use second-order theory of mind in Marble Drop and Colored Trails, and in the Mod game even third-order theory of mind.

We discuss different methods of estimating participants’ reasoning strategies in these games, some of them based only on their moves in a series of games, others based on reaction times or eye movements. In the coming era of hybrid intelligence, in which teams consist of humans, robots and software agents, it will be beneficial if the computational members of the team can diagnose the theory of mind levels of their human colleagues and even help them improve.
