Researchers have trained an AI to predict whether a song will be a hit with 97% accuracy, just by looking at how listeners' hearts beat while they hear it, without ever analyzing the song itself.
When recommendation algorithms are on the mark, listeners are happy, and so are the platforms, which earn more royalties, subscribers, and ad revenue. When they're consistently wrong, well, the music business is harsh.
Companies already rely heavily on algorithms that analyze a song's metadata (artist, genre, language, etc.) and the music itself (notes, lyrics, etc.) to identify probable hits, but even with all that, these models correctly predict whether a song will be a hit only about 50% of the time.
The beat goes on: In past research, Paul Zak, director of the Center for Neuroeconomics Studies at Claremont Graduate University (CGU), discovered that very subtle changes in heartbeats can predict brain activity associated with attention and emotional resonance.
“My lab previously identified what appears to be the brain’s valuation system for social and emotional experiences which I have called ‘Immersion,’” Zak told ZME Science.
“In talks with a streaming service, they told me that they struggle to suggest new music for subscribers due to the high volume of new music,” he continued. “I thought measuring neurologic Immersion could help solve this problem.”
Listen up: To find out if he was right, the streaming service sent Zak’s team 24 recently released songs, with a roughly even split of tracks it considered hits (700,000+ streams within six months of release) and flops.
The CGU researchers had 33 volunteers listen to the songs while wearing noninvasive cardiac sensors. The cardiac data was then fed into Zak's platform to determine the volunteers' neurophysiologic responses to the songs; as expected, the hits earned higher Immersion scores than the flops.
Data from how brains responded to just 24 songs wouldn't be enough to train an AI, since these systems thrive on large data sets, so the researchers used the information they'd collected to generate a synthetic data set of 10,000 neurophysiologic responses.
They then used half of the synthetic data set to train an AI to classify songs as hits or flops. When they tested the AI on the other half, as well as on the real data from the volunteers, it classified songs with 97% accuracy based solely on the neurophysiologic responses.
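For readers curious what this pipeline looks like in practice, here is a minimal sketch in Python. The study does not disclose its augmentation method, feature representation, or model, so everything below is an assumption: invented Immersion-style scores, Gaussian resampling as the synthetic-data step, and logistic regression as the classifier.

```python
# Illustrative sketch only: the real study's features, augmentation
# method, and classifier are not public, so these are stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical real data: one Immersion-style score per (listener, song)
# pair, with hits (label 1) scoring somewhat higher than flops (label 0).
real_hits = rng.normal(loc=0.65, scale=0.08, size=(33 * 12, 1))
real_flops = rng.normal(loc=0.50, scale=0.08, size=(33 * 12, 1))

def synthesize(samples: np.ndarray, n: int) -> np.ndarray:
    """Draw n synthetic responses from a Gaussian fit to the real data."""
    return rng.normal(samples.mean(), samples.std(), size=(n, 1))

# Inflate 24 songs' worth of responses into 10,000 synthetic ones.
X = np.vstack([synthesize(real_hits, 5000), synthesize(real_flops, 5000)])
y = np.array([1] * 5000 + [0] * 5000)

# Train on half the synthetic set, evaluate on the other half.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                          random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print(f"synthetic test accuracy: {clf.score(X_te, y_te):.2f}")
```

Note that accuracy here reflects only how separable the two invented score distributions are, not the 97% figure from the study; the point is the shape of the workflow (small real sample, synthetic expansion, train/test split), not its result.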
“By applying machine learning to neurophysiologic data, we could almost perfectly identify hit songs,” said Zak. “That the neural activity of 33 people can predict if millions of others listened to new songs is quite amazing. Nothing close to this accuracy has ever been shown before.”
Looking ahead: The study was small, in terms of both participants and the number of songs. It didn’t include people from all demographics, either, and while synthetic data sets are supposed to be statistically accurate representations of the real data used to create them, that isn’t always the case.
If future research confirms Zak's AI, though, radio stations and streaming platforms could start basing their recommendations on a new song's likelihood of generating high levels of Immersion in listeners, in addition to, or even instead of, their traditional hit-picking tools.