Dr. Edward W. Large, director of the Music Dynamics Laboratory at the University of Connecticut, and a group of international collaborators have unveiled a groundbreaking theory explaining how the brain’s naturally occurring rhythms resonate with the elements of music.

Published in Nature Reviews Neuroscience, the paradigm-shifting paper introduces Neural Resonance Theory (NRT). This theory offers a new way to understand how the brain transforms sound into the human experience of music.

“Music isn’t just about the sound outside of us,” says Large. “It’s about how the sound interacts with dynamic rhythmic patterns in the brain.”

NRT challenges older theories that depend on prediction and learned expectations to explain why humans enjoy music. The new theory offers an explanation based on physical processes: the rhythms (oscillations) of the brain’s neural activity synchronize with the pitches and rhythms in the music. According to NRT, people can keep time, dance, and effectively improvise music because human biological processes can sync with music, from simple tunes to complex melodies.
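
To make the idea of synchronization concrete, the short Python sketch below (an illustration only, not a model taken from the paper) drives a single Hopf-type oscillator, a standard mathematical picture of a self-sustaining rhythm, with a periodic stimulus whose frequency is slightly detuned from the oscillator’s natural frequency. The single-oscillator setup and all parameter values here are illustrative assumptions.

```python
import numpy as np

# Toy sketch of entrainment (illustrative assumptions, not the paper's model):
# a self-sustaining Hopf-type oscillator is driven by a periodic stimulus
# whose frequency is slightly detuned from the oscillator's own. When the
# oscillator entrains, its phase relative to the stimulus stops drifting.

alpha = 1.0    # bifurcation parameter (> 0: oscillation is self-sustaining)
beta = -1.0    # amplitude saturation term
f_osc = 2.0    # oscillator natural frequency, Hz (illustrative)
f_stim = 2.03  # stimulus frequency, Hz (slightly detuned)
force = 0.5    # stimulus coupling strength (illustrative)

dt = 0.001                    # forward-Euler step, seconds
t = np.arange(0.0, 20.0, dt)  # 20 seconds of simulated time

z = 0.1 + 0.0j                # complex oscillator state
rel_phase = np.empty_like(t)  # oscillator phase relative to the stimulus

for i, ti in enumerate(t):
    stim = force * np.exp(1j * 2 * np.pi * f_stim * ti)  # periodic drive
    dz = z * (alpha + 1j * 2 * np.pi * f_osc + beta * abs(z) ** 2) + stim
    z = z + dt * dz                                       # Euler update
    rel_phase[i] = np.angle(z * np.exp(-1j * 2 * np.pi * f_stim * ti))

# Once entrained, the relative phase is roughly constant, so its drift over
# the final five seconds should be close to zero.
drift = rel_phase[-1] - rel_phase[len(t) - int(5.0 / dt)]
print(f"relative-phase drift over final 5 s: {drift:+.3f} rad")
```

Running the sketch prints a near-zero phase drift, the signature of entrainment; with a larger detuning or weaker coupling, the relative phase would drift continuously instead of locking.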

“Music cognition is the physical embodiment of these resonance relationships,” says Large.

One of the most compelling aspects of NRT is that it bridges science with cross-cultural understanding. It shows that musical structures found around the world—like pulse, consonance, and octave relationships—map onto the brain’s most stable resonant states. This could explain why music can feel universal, even when the style is unfamiliar.

In addition to Large, who is a professor of psychological sciences and of physics, the multi-institutional collaboration included other researchers at the University of Connecticut, as well as at the University of Groningen (Netherlands), the University of Illinois Chicago, Queen Mary University of London, and McGill University (Canada).

Beyond advancing scientific understanding of music cognition, these findings also have the potential to revolutionize applications of music in AI, education, the arts, and healthcare.

Real-world implications include: 

  • Music and Health: NRT may improve music therapy for conditions like Alzheimer’s disease, Parkinson’s disease, depression, or stroke by helping tailor rhythm-based interventions.
  • Smarter AI Music Tools: Machines trained on neural resonance could produce more emotionally intelligent and culturally aware music.
  • Inclusive Education: Learning tools could use resonance to help people better grasp rhythm and pitch.
  • Global Understanding: The theory affirms shared biological roots of musical expression across cultures. 

Large is the founder and Chief Executive Officer of Oscillo Biosciences, a healthcare startup that uses music and light therapy to help mitigate disease progression in Alzheimer’s patients. The therapy is a highly promising application of the neural resonance theory developed through Large’s research.

Other UConn contributors to the research include Ji Chul Kim, an assistant research professor in the department of psychological sciences and co-founder/Chief Science Officer at Oscillo Biosciences; and Parker Tichko, who received his Ph.D. in psychological sciences from UConn in 2019.

The full paper, Musical Neurodynamics, is available in Nature Reviews Neuroscience.