No matter how emphatically a person bobs their head or hums along to the music in their headphones, you can never be totally sure what they’re listening to unless you ask them. Many people, myself included, find great comfort in the certainty that nobody can tell that I kinda like Justin Timberlake’s extremely bad new foray into country music. If you share this sentiment, prepare to be rattled by new research from the D’Or Institute for Research and Education.

On Friday, in the journal Scientific Reports, the researchers behind the study reported that they have successfully used human brain scans to identify which genre of music people are listening to with up to 85 percent accuracy. They used a technique called brain decoding, which rests on the assumption that the brain responds in unique ways to specific stimuli (in this case, different characteristics of a song) and that each of these stimuli can be matched with an observable brain response to ultimately form a code.

“Machines will be able to translate our musical thoughts into songs,” said study co-author Sebastian Hoefle, a Federal University of Rio de Janeiro Ph.D. student who also does research with the D’Or Institute, in a statement published Friday.

Chet Baker's 'Albert's House' was one of the songs in the jazz/tenderness-evoking category.

The team started by using magnetic resonance imaging to record, via computer, the brain activity of six volunteers as they listened to 40 excerpts of songs from different genres, including classical, rock, pop, folk, and jazz. The songs were further sorted into tender songs (Chet Baker’s “Albert’s House”) and joy-evoking songs (Dave Brubeck’s “Take Five”) to see how nuanced the code could get, and each was assessed for features like tonality, dynamics, rhythm, and timbre so that the computer could match these qualities to the brain response.

Then, they reversed the experiment to see whether the computer could predict, on the basis of brain activity alone, which song was being heard. In the first trial, the computer was 85 percent accurate when it picked the correct answer from two songs; in the second, choosing from ten songs, it still got it right 74 percent of the time.
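For readers curious how this kind of two-alternative decoding works in principle, it is often framed as model-based pattern matching: fit a mapping from a song’s features to the brain response it evokes, then, given a new brain response, pick whichever candidate song’s predicted response best matches it. Here is a toy sketch in Python; everything in it (the simulated data, the linear encoding model, the feature and voxel counts) is invented for illustration and is not the study’s actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each song is a vector of musical-feature scores
# (think tonality, dynamics, rhythm, timbre); each brain response is a
# vector of voxel activations. All values here are simulated.
n_songs, n_features, n_voxels = 10, 4, 50
songs = rng.normal(size=(n_songs, n_features))

# Assume the brain maps features to voxels roughly linearly, plus noise.
# This "true" map is hidden from the decoder, which must estimate it.
true_map = rng.normal(size=(n_features, n_voxels))
responses = songs @ true_map + 0.1 * rng.normal(size=(n_songs, n_voxels))

# Fit an encoding model on the first eight songs by least squares.
train = slice(0, 8)
W, *_ = np.linalg.lstsq(songs[train], responses[train], rcond=None)

def decode(brain_response, candidates):
    """Return the index of the candidate song whose predicted brain
    response correlates best with the observed response."""
    preds = candidates @ W
    corrs = [np.corrcoef(brain_response, p)[0, 1] for p in preds]
    return int(np.argmax(corrs))

# Two-alternative test, mimicking the study's easier condition: given
# the response evoked by held-out song 8, choose between songs 8 and 9.
choice = decode(responses[8], songs[[8, 9]])
print(choice)  # 0 means the decoder picked the song actually heard
```

With low noise the decoder picks the correct song; as the simulated noise grows, accuracy falls toward chance, which is one intuition for why the ten-song condition scored lower than the two-song one.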

Dave Brubeck's classic "Take Five" represented a jazz/joy-inducing track.

This isn’t the first time a computer has been able to identify songs based on listeners’ brain patterns, but it does mark a new standard for accuracy. The point of this sort of work is not, fortunately, to music-shame unwitting headphone-wearers who think they can get away with listening to old John Mayer unnoticed but rather to improve brain-computer interfaces for people who are unable to communicate through listening or speech. That, and breaking down the musical features of a song that elicit the most positive brain response in listeners, perhaps to compose better-performing pop songs in the future. Justin Timberlake, take note.
