Universiteit Leiden


Research project

Expression in music: Semantics and emotion

Music and language both communicate through sound, using cues such as timing and pitch. While language mainly conveys semantic meaning, instrumental music does not, and is instead typically seen as expressing emotion. This project investigates how well listeners recognize the emotions that performers intend to express when instructed with metaphors. The research advances our understanding of nonverbal communication and may help make AI-generated music more expressive and human-like.

Duration
2025 - 2026
Contact
Rebecca Schaefer
Funding
KIEM grant

About the project

Emotional communication through music relies not only on musical content but also on expressive cues, for instance in timing and dynamics. Using expressively played melodies and a large online participant group, we will examine how individual differences in imagery ability, culture, age, and musical background affect the emotional perception of music. The resulting findings will be applied to expressive music generation, to assess whether human-produced expressive cues can be used to improve listeners' aesthetic responses to artificially generated music.

Aim of the project

This interdisciplinary research enhances our understanding of non-verbal communication in musical performance as well as its interpretation. In addition, the outcomes may address the current perceived lack of expressiveness in AI-generated music. Teaching generative models how to translate expressive cues to musical features could make AI-generated music more human and emotionally impactful, with applications beyond commercial music in, for example, healthcare settings.

Interdisciplinary character of the project

This research strengthens early-stage interdisciplinary collaboration and provides a basis for efforts to acquire follow-up funding. Our team brings together expertise in psychology, music cognition, and motor control (Dr. Schaefer) with expertise in AI and machine learning, human-machine interaction, and linguistics (Dr. Verhoef).

Each of these academic disciplines stands to benefit from this line of research:

  • Cognitive science: deepening our understanding of how humans can convert abstract concepts such as images and metaphors into concrete actions leading to specific music performance characteristics.
  • Linguistics: exploring the intersection of language and music can shed light on the semantics of non-verbal communication. This can enhance our understanding of metaphorical language and its role in human cognition and communication.
  • Musicology: offering new methodologies for analyzing musical performances and compositions, contributing to a richer understanding of musical interpretation and pedagogy.
  • AI and machine learning: improving the design of algorithms that generate or interpret music, and enhancing human-computer interaction in creative fields.