AI predicts hit songs based on listeners’ heartbeats

It can determine whether a track is a hit or flop with 97% accuracy.
Credit: Adobe Stock / Annelisa Leinbach

Researchers have trained an AI to predict hit songs with remarkable accuracy by analyzing how listeners’ hearts beat while they hear a track — without ever analyzing the song itself.

ISO: hit songs: About 100,000 new songs are released every single day, and record labels, radio stations, and music apps are constantly trying to predict which will be hits.

If they’re on the mark, listeners are happy — and so are the platforms, with more royalties, subscribers, and ad revenue. If they’re consistently wrong, well, the music business is harsh.

Companies already rely heavily on algorithms that analyze a song’s metadata (the artist, genre, language, etc.) and the music itself (notes, lyrics, etc.) to identify probable hits, but even with all that, these tools correctly predict whether a song will be a hit only about 50% of the time.

“My lab previously identified what appears to be the brain’s valuation system for social and emotional experiences.”

Paul Zak

The beat goes on: In past research, Paul Zak, director of the Center for Neuroeconomics Studies at Claremont Graduate University (CGU), discovered that very subtle changes in heartbeats can predict brain activity associated with attention and emotional resonance.

“My lab previously identified what appears to be the brain’s valuation system for social and emotional experiences which I have called ‘Immersion,’” Zak told ZME Science.

“In talks with a streaming service, they told me that they struggle to suggest new music for subscribers due to the high volume of new music,” he continued. “I thought measuring neurologic Immersion could help solve this problem.”

Listen up: To find out if he was right, the streaming service sent Zak’s team 24 recently released songs, with a roughly even split of tracks it considered hits (700,000+ streams within six months of release) and flops.

The CGU researchers had 33 volunteers listen to the songs while wearing noninvasive cardiac sensors. That data was then fed to Zak’s platform to determine their neurophysiologic responses to the songs — as expected, hits had higher “Immersion” scores than the flops.

“By applying machine learning to neurophysiologic data, we could almost perfectly identify hit songs.”

Paul Zak

Data from how brains responded to just 24 songs wouldn’t be enough to train an AI — these systems thrive on large data sets — so the researchers used the information they’d collected to generate a synthetic data set of 10,000 neurophysiologic responses.
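The paper doesn’t spell out exactly how the synthetic responses were produced, but a common approach is to resample the real measurements and perturb them with small amounts of noise so the synthetic set preserves the real data’s distribution. The feature values and function below are illustrative assumptions, not figures from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical measured data: one "Immersion"-style feature vector
# per volunteer-song pair (33 volunteers x 24 songs). Values are
# illustrative placeholders, not data from the study.
real_features = rng.normal(loc=0.5, scale=0.1, size=(33 * 24, 4))

def synthesize(real, n_synthetic, noise_scale=0.05, rng=rng):
    """Create synthetic samples by resampling real rows with replacement
    and adding Gaussian noise, preserving the overall distribution."""
    idx = rng.integers(0, len(real), size=n_synthetic)
    noise = rng.normal(scale=noise_scale, size=(n_synthetic, real.shape[1]))
    return real[idx] + noise

synthetic = synthesize(real_features, n_synthetic=10_000)
print(synthetic.shape)  # (10000, 4)
```

Because each synthetic row is a jittered copy of a real one, summary statistics like the mean stay close to the original data — which is the property the researchers would need for the synthetic set to stand in for real responses.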

They then used half of the data set to train an AI to classify songs as hits or misses. When they tested the AI on the other half, along with the real data from the volunteers, it classified songs correctly 97% of the time based on the neurophysiologic responses to them.
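The train-on-half, test-on-half workflow can be sketched in a few lines. The study doesn’t name its model, so the nearest-centroid classifier below is a hedged stand-in, and the simulated data (hits given slightly higher feature values, echoing the reported higher Immersion scores) is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data set: 10,000 feature vectors with labels,
# where hits (label 1) get slightly higher "Immersion"-like values.
n, d = 10_000, 4
labels = rng.integers(0, 2, size=n)  # 1 = hit, 0 = flop
features = rng.normal(loc=0.4 + 0.2 * labels[:, None], scale=0.1, size=(n, d))

# Train on one half, test on the other, mirroring the study's setup.
half = n // 2
X_train, y_train = features[:half], labels[:half]
X_test, y_test = features[half:], labels[half:]

# Minimal nearest-centroid classifier: assign each test point to the
# class whose training-set mean it sits closest to.
centroids = np.array([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
preds = dists.argmin(axis=1)

accuracy = (preds == y_test).mean()
print(f"test accuracy: {accuracy:.2%}")
```

On data this cleanly separated, even a simple classifier scores in the high-90s — which illustrates the mechanics of the evaluation, though not why the real responses were so separable.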

“By applying machine learning to neurophysiologic data, we could almost perfectly identify hit songs,” said Zak. “That the neural activity of 33 people can predict if millions of others listened to new songs is quite amazing. Nothing close to this accuracy has ever been shown before.”

Looking ahead: The study was small, in terms of both participants and the number of songs. It didn’t include people from all demographics, either, and while synthetic data sets are supposed to be statistically accurate representations of the real data used to create them, that isn’t always the case.

If future research validates Zak’s approach, though, radio stations and streaming platforms could start basing their recommendations on a new song’s likelihood of generating high levels of Immersion in listeners, in addition to — or even instead of — their traditional hit-picking tools.

We’d love to hear from you! If you have a comment about this article or if you have a tip for a future Freethink story, please email us at [email protected].
