Brain & Music
Sponsored by USC, NTUA
Entrainment in Expert and Novice Musicians
Human brains and bodies naturally synchronize with musical rhythms, a phenomenon known as entrainment. While this phenomenon is foundational to musical experience, little is known about how musical expertise modulates entrainment processes across neural, physiological, and behavioral dimensions, particularly in naturalistic settings. Existing studies largely focus on isolated modalities or controlled stimuli, limiting our understanding of holistic musical engagement. The objective of this project is to investigate how musical training shapes multimodal entrainment during naturalistic music listening. By integrating electroencephalography (EEG), physiological sensors, and behavioral monitoring with wearables, we will collect multimodal data from expert and novice listeners, aiming to capture a broad spectrum of biobehavioral responses. Furthermore, we will leverage artificial intelligence (AI) methods inspired by time-series and speech modeling to uncover entrainment patterns from this multimodal setup.
Our central hypothesis is that expert musicians will exhibit stronger, more sustained, and more nuanced entrainment across measures, and possibly across an extended range of musical attributes, reflecting enhanced predictive timing and emotional engagement. We therefore propose the following research aims for the project:
Aim 1: Quantify neural and physiological entrainment to music in expert musicians and novices. Here we examine computational methods that can map complex multimodal behaviors to well-defined features of the music stimuli.
Aim 2: Identify differences in attentional strategies and emotional responses between groups. We ask whether expert musicians experience entrainment differently than novices, focusing on the temporal dynamics of auditory attention and engagement.
Aim 3: Determine which musical features drive entrainment. Here we will isolate and extract the structural and acoustic elements of music that influence entrainment.
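To illustrate one way Aim 1 could be approached, the sketch below computes the phase-locking value (PLV) between a band-passed EEG channel and the amplitude envelope of the music stimulus, a common measure of neural entrainment. The frequency band, sampling rate, and synthetic signals are illustrative assumptions, not the project's actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter between lo and hi Hz."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def phase_locking_value(eeg, envelope, fs, band=(1.0, 4.0)):
    """PLV between an EEG channel and the stimulus amplitude envelope
    in a given band: 0 = no phase consistency, 1 = perfect locking."""
    phase_eeg = np.angle(hilbert(bandpass(eeg, *band, fs)))
    phase_env = np.angle(hilbert(bandpass(envelope, *band, fs)))
    return np.abs(np.mean(np.exp(1j * (phase_eeg - phase_env))))

# Synthetic demo: an "entrained" trace sharing a 2 Hz rhythm with the
# envelope should score a much higher PLV than unrelated noise.
fs = 250
t = np.arange(0, 20, 1 / fs)
envelope = np.sin(2 * np.pi * 2 * t)
rng = np.random.default_rng(0)
eeg_entrained = envelope + 0.5 * rng.standard_normal(t.size)
eeg_random = rng.standard_normal(t.size)
print(phase_locking_value(eeg_entrained, envelope, fs))  # high
print(phase_locking_value(eeg_random, envelope, fs))     # lower
```

In practice the same computation would be repeated per channel, band, and listener, with group comparisons between experts and novices.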
Affective Responses to Musical Stimuli
The scientific study of human emotions is well established in Psychology and Neuroscience, yet from a computational perspective it remains comparatively underexplored. While Artificial Intelligence has advanced remarkably in modeling rational intelligence, reliable systems for affective analysis are lacking. Emotions are inherently subjective, context dependent, and vary across individuals, posing fundamental challenges. Most computational studies rely on behavioral cues—speech, text, or facial expressions—yet these are indirect proxies. In this thesis, we instead analyze brain signals, focusing on the Electroencephalogram (EEG), as more objective indicators of affective states. Physiological and neural recordings not only offer greater reliability but also hold promise for clinical applications such as brain disorder treatment and rehabilitation. Music is used as the emotion-eliciting stimulus, given its profound and universal impact on human emotion.
Our approach consists of two main parts. First, we investigate the complex structure of EEG signals through novel feature extraction schemes based on two multifractal algorithms: Multiscale Fractal Dimension and Multifractal Detrended Fluctuation Analysis. These features quantify signal complexity across multiple timescales and outperform standard baselines in emotion recognition, showing competitive results in subject-independent settings and highlighting arousal as strongly tied to EEG complexity. Second, we design a two-branch neural network as a bimodal EEG–music framework, learning shared latent representations between EEG responses and their music stimuli. This model enables supervised emotion recognition and music retrieval from EEG queries. Applied to independent subjects, it further reveals patterns in brain–music correspondence, temporal dynamics of music-induced emotions, and activated brain regions. Overall, this study advances the interpretation of complex EEG activity and demonstrates how music can effectively stimulate and reveal the brain’s affective responses.
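A minimal sketch of the two-branch idea, assuming PyTorch and placeholder feature dimensions (the thesis architecture and training objective may differ): each branch projects its modality into a shared latent space, an InfoNCE-style contrastive loss aligns paired EEG and music embeddings, and music retrieval ranks stimuli by cosine similarity to an EEG query.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BimodalEncoder(nn.Module):
    """Two MLP branches mapping EEG features and music-stimulus
    features into a shared latent space (dimensions are illustrative)."""
    def __init__(self, eeg_dim=128, music_dim=64, latent_dim=32):
        super().__init__()
        self.eeg_branch = nn.Sequential(
            nn.Linear(eeg_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim))
        self.music_branch = nn.Sequential(
            nn.Linear(music_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim))

    def forward(self, eeg, music):
        # L2-normalize so cosine similarity reduces to a dot product
        z_eeg = F.normalize(self.eeg_branch(eeg), dim=-1)
        z_music = F.normalize(self.music_branch(music), dim=-1)
        return z_eeg, z_music

def contrastive_loss(z_eeg, z_music, temperature=0.1):
    """InfoNCE-style loss: each EEG segment should be most similar
    to its own stimulus among all stimuli in the batch."""
    logits = z_eeg @ z_music.t() / temperature
    targets = torch.arange(z_eeg.size(0))
    return F.cross_entropy(logits, targets)

# Toy batch: 8 EEG segments paired with 8 music-feature vectors
model = BimodalEncoder()
z_eeg, z_music = model(torch.randn(8, 128), torch.randn(8, 64))
loss = contrastive_loss(z_eeg, z_music)

# Music retrieval from an EEG query: rank stimuli by cosine similarity
ranking = (z_eeg[0] @ z_music.t()).argsort(descending=True)
```

The shared embedding supports both described uses: a classifier head on the latent vectors for supervised emotion recognition, and nearest-neighbor search over stimulus embeddings for retrieval from EEG queries.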