AI predicts hit songs based on listeners’ heartbeats

It can determine whether a track is a hit or flop with 97% accuracy.
Credit: Adobe Stock / Annelisa Leinbach

Researchers have trained an AI to predict hit songs with remarkable accuracy, just by looking at how listeners’ hearts beat while they listen — without ever analyzing the songs themselves.

ISO: hit songs: About 100,000 new songs are released every single day, and record labels, radio stations, and music apps are constantly trying to predict which will be hits.

If they’re on the mark, listeners are happy — and so are the platforms, with more royalties, subscribers, and ad revenue. If they’re consistently wrong, well, the music business is harsh.

Companies already rely heavily on algorithms that analyze a song’s metadata (the artist, genre, language, etc.) and the music itself (notes, lyrics, etc.) to try to identify probable hit songs. Even with all that, these tools correctly predict whether a song will be a hit only about 50% of the time.

“My lab previously identified what appears to be the brain’s valuation system for social and emotional experiences.”

Paul Zak

The beat goes on: In past research, Paul Zak, director of the Center for Neuroeconomics Studies at Claremont Graduate University (CGU), discovered that very subtle changes in heartbeats can predict brain activity associated with attention and emotional resonance.

“My lab previously identified what appears to be the brain’s valuation system for social and emotional experiences, which I have called ‘Immersion,’” Zak told ZME Science.

“In talks with a streaming service, they told me that they struggle to suggest new music for subscribers due to the high volume of new music,” he continued. “I thought measuring neurologic Immersion could help solve this problem.”

Listen up: To find out if he was right, the streaming service sent Zak’s team 24 recently released songs, with a roughly even split of tracks it considered hits (700,000+ streams within six months of release) and flops.

The CGU researchers had 33 volunteers listen to the songs while wearing noninvasive cardiac sensors. That data was then fed to Zak’s platform to determine their neurophysiologic responses to the songs — as expected, hits had higher “Immersion” scores than the flops.

“By applying machine learning to neurophysiologic data, we could almost perfectly identify hit songs.”

Paul Zak

Data from how brains responded to just 24 songs wouldn’t be enough to train an AI — the systems thrive on large data sets — so the researchers used the information they’d collected to generate a synthetic data set of 10,000 neurophysiologic responses.

They then used half of the data set to train an AI to classify songs as hits or flops. When they tested the AI on the other half, as well as on the real data from the volunteers, it was 97% accurate at classifying a song based on the neurophysiologic response to it.
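The workflow described above — expand a small set of real responses into a large synthetic set, train on half, and evaluate on the held-out half — can be sketched in a few lines. This is a minimal illustration only: the study doesn’t specify its synthetic-data method or classifier, so the bootstrap-style resampling, the logistic-regression model, the immersion-score features, and every number below are assumptions for demonstration, not the researchers’ actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in for the real data: 33 listeners' immersion scores
# for 24 songs, where hits (label 1) tend to score higher than flops (0).
n_songs, n_listeners = 24, 33
is_hit = np.array([1] * 12 + [0] * 12)
real = rng.normal(loc=is_hit[:, None] * 0.8, scale=1.0,
                  size=(n_songs, n_listeners))

# Synthetic-data step (assumed method): bootstrap-resample listener
# responses per song to expand 24 real observations into 10,000.
n_synth = 10_000
song_idx = rng.integers(0, n_songs, size=n_synth)
listener_idx = rng.integers(0, n_listeners, size=(n_synth, n_listeners))
boot = real[song_idx][np.arange(n_synth)[:, None], listener_idx]

# Simple summary features per synthetic response: mean and spread.
X = np.column_stack([boot.mean(axis=1), boot.std(axis=1)])
y = is_hit[song_idx]

# Train on half the synthetic set, test on the held-out half.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                          random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"held-out accuracy: {accuracy:.2f}")
```

Because the simulated hits and flops are well separated, this toy classifier scores high on the held-out half — which also illustrates the caveat raised below: a synthetic set inherits whatever structure (or bias) was in the small real sample it was built from.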

“By applying machine learning to neurophysiologic data, we could almost perfectly identify hit songs,” said Zak. “That the neural activity of 33 people can predict if millions of others listened to new songs is quite amazing. Nothing close to this accuracy has ever been shown before.”

Looking ahead: The study was small, in terms of both participants and the number of songs. It didn’t include people from all demographics, either, and while synthetic data sets are supposed to be statistically accurate representations of the real data used to create them, that isn’t always the case.

If future research confirms Zak’s AI, though, radio stations and streaming platforms could start basing their recommendations on a new song’s likelihood of generating high levels of Immersion in listeners, in addition to — or even instead of — their traditional hit-picking tools.

