In psychology, theory of mind (ToM) refers to the capacity to understand other people by ascribing mental states to them. This ability, which humans develop around the age of six, helps in many life situations, in particular in developing empathy and making successful strategic decisions. In this project we focus on developing computational models of ToM that can be embedded in artificial agents to facilitate successful interaction with humans.
As AI becomes more prevalent in our daily lives, it will increasingly be employed in collaborative situations where humans and AI work together to achieve a common goal. To understand the dynamics of human-AI interaction, one must first model human-human interaction, and ToM is one of the fundamental skills underlying human interactions. Hence, in this thesis, we aim to design computational models of this particular human skill that an AI system can use to interact successfully with humans.
Behavioral studies, in which participants are instructed to play a game that involves theory of mind, are a frequent method for studying these interaction dynamics. We choose game theory to realize such experiments. In particular, we rely on games which are referred to in game theory as “games of incomplete information” (e.g., signaling games). Such games are useful for modeling scenarios in which agents must adopt a strategy without full knowledge of their opponent's preferences. The standard approach in game theory is to assume that the players are rational; nevertheless, the behavioral sciences have demonstrated human irrationality across a broad spectrum of decision-making processes. In this thesis, we aim to explore the use of ToM in such games and whether it can explain human irrationality (e.g., acting honestly in a signaling game).
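As an illustration of how ToM orders interact in a signaling game, consider the following toy sketch. All names, payoffs, and strategies here are simplified assumptions for illustration only, not part of the project specification: a sender privately observes a state and sends a message, a receiver then accepts or rejects, and agents of different ToM orders reason differently about each other.

```python
# Toy signaling game with hypothetical payoffs (illustrative assumptions).
STATES = ["high", "low"]

def receiver_payoff(state, action):
    # The receiver only benefits from accepting when the true state is "high".
    if action == "accept":
        return 1.0 if state == "high" else -1.0
    return 0.0

def receiver_act(message, tom_order):
    """Zero-order: take the message at face value.
    First-order: reason 'the sender benefits from acceptance regardless of
    the state, so the message carries no information' and play it safe."""
    if tom_order == 0:
        return "accept" if message == "high" else "reject"
    return "reject"

def sender_message(state, receiver_order):
    """A first-order sender models the receiver: against a zero-order
    (credulous) receiver it always claims "high"; against a first-order
    receiver lying is pointless, so it reports the state truthfully."""
    if receiver_order == 0:
        return "high"
    return state

# One illustrative round: a first-order sender exploits a zero-order receiver.
state = "low"
msg = sender_message(state, receiver_order=0)   # "high": the sender exaggerates
act = receiver_act(msg, tom_order=0)            # "accept": taken at face value
print(receiver_payoff(state, act))              # -1.0: the credulous receiver loses
```

The sketch shows the kind of asymmetry the thesis would study: an agent with a higher ToM order can out-reason a lower-order opponent, while against an equally sophisticated opponent deception stops paying off, which is one candidate explanation for seemingly irrational honest behavior.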
- Design computational models for ToM and its different orders, and embed these in artificial agents
- Analyze the role ToM plays in human decisions for the case of a set of selected games
- Identify the correlation between ToM and irrational decisions
The work in this master thesis entails:
- Literature study on ToM and games of incomplete information
- Writing simple programs to model the different levels of ToM
- Designing and conducting gaming experiments between agents and humans
- Evaluating the role of ToM in human decisions in the context of the considered games
- K. Veltman, H. de Weerd, and R. Verbrugge. Training the use of theory of mind using artificial agents. Virtual Agents for Social Skills Training, 13(1):3-18, 2019. doi:10.1007/s12193-018-0287-x.
- H. de Weerd, R. Verbrugge and B. Verheij. Negotiating with other minds: The role of recursive theory of mind in negotiation with incomplete information. Autonomous Agents and Multi-Agent Systems, 31(2): 250–287, 2017. doi:10.1007/s10458-015-9317-1.
- Courses: Programming in Python (INF-22306)
- Required skills/knowledge: Computational modelling, computational psychology, interest in decision-making processes, psychology.
Key words: agent-based modelling, (agent) simulation modelling, computational psychology, decision-making, behavioral sciences, computational societies
Yara Khaluf (firstname.lastname@example.org)