Music critics regularly scour Thesaurus.com for the best adjectives to throw into their perfectly descriptive melodious disquisitions on the latest works from Drake, Radiohead, or whomever. And listeners from all walks of life have, since the beginning of music itself, been guilty of lazily pigeonholing artists into socially constructed genres. But all of that can (and should) be thrown out the window now, because new research suggests that, to match music to a listener’s personality, all you need are three scientifically measurable attributes.
In a study titled “The Song Is You: Preferences for Musical Attribute Dimensions Reflect Personality,” published in Social Psychological and Personality Science, scientists from McGill, Cambridge, and Stanford explain that the psychological effects of music all fit within the categories of arousal, valence, and depth.
“[Genres] are essentially defined by social elements and industry, but what we want to move toward is the actual attributes in music that people are preferring and having an emotional reaction to,” David Greenberg, a lead author of the study and music psychologist at Cambridge University and City University of New York, tells Inverse.
Rather than classifying music by genre, the study measures: arousal, the energy and intensity of the music; valence, the emotional tone from happy to sad; and depth, the intellectual sophistication of the music. Specific songs and artists might fall somewhere on the spectrum measured by the three qualities, or a song might be so overwhelmingly beat-driven or emotionally scarring that it neatly fits into one category.
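As a rough illustration (and not the study’s actual method), a song can be pictured as a point in this three-attribute space, with similar songs sitting close together regardless of genre label. The scores and song descriptions below are invented for the sketch:

```python
import math

# Hypothetical attribute scores on a 0-1 scale: (arousal, valence, depth).
# These values are illustrative, not taken from the study.
songs = {
    "beat-driven club track": (0.95, 0.80, 0.20),
    "slow introspective gospel song": (0.20, 0.35, 0.90),
    "brooding art-rock song": (0.30, 0.25, 0.85),
}

def distance(a, b):
    """Euclidean distance between two attribute vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Songs close together in attribute space land in the same "bin",
# whatever their genre: here the gospel and art-rock tracks are
# near neighbors, while the club track sits far from both.
d = distance(songs["slow introspective gospel song"],
             songs["brooding art-rock song"])
print(f"gospel vs. art-rock distance: {d:.2f}")  # -> 0.15
```

Under a scheme like this, “bins” fall out of proximity in attribute space rather than from industry labels, which is the point Greenberg’s group is making.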
This suggests that a slow, introspective gospel song from Chance The Rapper’s upcoming album could have the same depth as a track from Radiohead’s A Moon Shaped Pool. So a system of categorization based on Greenberg’s research would, surprisingly but rightfully, place the rap and rock works in the same bin.
Greenberg says classifying music along these three dimensions could have large consequences across industry and research. Music therapists could use the attributes to speed up recovery times for patients, and streaming services like Spotify, Pandora, YouTube, and Google Play Music could fold them into their predictive music discovery algorithms.
Theoretically, streaming services could prompt users to take a personality test and then deliver better, more emotionally striking music. Services such as Apple Music and Google Play already ask new users to select a series of artists to seed their taste algorithms, so why not take it a step further?
Greenberg already offers an extensive public online test that will identify a person’s music personality. It’s based on a well-regarded psychological model called the Five Factor Model, which measures openness to experience, conscientiousness, extraversion, agreeableness, and neuroticism.
“People who were high on openness to experience were preferring … the depth in music, so they were looking at qualities that were more sophisticated and complex,” Greenberg says. “On the other hand, people who were high on neuroticism were preferring more intensity in music.”
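A minimal sketch of how those two relationships could drive a recommendation, assuming simple trait-to-attribute mappings; only the links Greenberg mentions (openness → depth, neuroticism → arousal/intensity) come from the article, and everything else, including the neutral valence default, is an assumption for illustration:

```python
# Hypothetical Big Five scores on a 0-1 scale for one listener.
listener = {"openness": 0.9, "neuroticism": 0.2}

def preferred_profile(traits):
    """Map personality traits to an (arousal, valence, depth) preference.

    Only the openness->depth and neuroticism->arousal links reflect the
    reported findings; the rest is a placeholder for this sketch.
    """
    arousal = traits.get("neuroticism", 0.5)  # high neuroticism -> intensity
    depth = traits.get("openness", 0.5)       # high openness -> depth
    valence = 0.5                             # no reported link; stay neutral
    return (arousal, valence, depth)

print(preferred_profile(listener))  # -> (0.2, 0.5, 0.9)
```

A service could then recommend whichever songs sit closest to this preference point in attribute space, rather than whichever share a genre tag with the listener’s history.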
The research also found that music that’s more personal to the listener elicits a stronger emotional response. Any music listener could probably have come up with that same theory themselves, but it’s good to have the research to back it up. It’s just like Joni Mitchell said in a 2013 interview with the CBC: “The trick is if you listen to that music and you see me, you’re not getting anything out of it. If you listen to that music and you see yourself, it will probably make you cry and you’ll learn something about yourself and now you’re getting something out of it.”