
U2 Music Videos Are Now Being Used in Artificial Intelligence Research

Future software could identify the objects on our screens and our emotional attachment to them.

Justin Sullivan/Getty Images

Artificial intelligence could one day scan the music videos we watch to generate predictive music discovery options based on the emotions of the performer, which means A.I. could soon recognize Bono’s sad face and serve you more mopey Bono, or perhaps something more smiley.

The technology to do so isn’t quite there yet, but Diane Rasmussen Pennington, a lecturer at the University of Strathclyde in Glasgow, Scotland, conducted research on 150 videos made by U2 fans, recording non-text-based emotional cues to demonstrate which facial features and objects might be most useful to future software programs.

“Full disclosure, U2 has been my favorite band for a long time,” Pennington tells Inverse, adding that there were a number of other reasons to use the band’s videos as a testing ground. “Objectively, from reading I’ve done and not just my perception as a fan, generally the people who like U2’s music find it to be a very emotional experience when they listen to it, maybe more so than other bands. … There’s something almost highly spiritual for fans of the band.”

The Edge of U2 performs live on the Pyramid Stage during the Glastonbury Festival at Worthy Farm, Pilton, on June 24, 2011, in Glastonbury, England.

Ian Gavan/Getty Images

Pennington chose to focus on just one U2 tune, “Song For Someone,” from the band’s controversial 2014 record Songs of Innocence, which is still sitting in a lot of people’s iTunes libraries whether they want it or not. The videos included fan slideshows set to the band’s music, tutorials on how to perform the song, and, of course, covers. She categorized the facial expressions of the performers as well as fan memorabilia, including T-shirts, concert posters, and Bono-inspired sunglasses.

Pennington says A.I. software is getting better at identifying different textures and circular objects in photos and videos, but she hopes her research can be a small stepping stone toward identifying the emotional meaning behind those objects.
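To make that idea concrete, here is a minimal sketch of how detected non-text cues might be rolled up into emotional tags for a single video. The cue names, emotion labels, and mapping are invented for illustration; they are not Pennington’s actual coding scheme, only an assumption about what such a scheme could look like.

```python
from collections import Counter

# Hypothetical, hand-assigned associations between non-text cues
# (facial expressions, fan memorabilia) and broad emotional labels.
# Purely illustrative -- not the categories used in the study.
CUE_EMOTIONS = {
    "smiling_face": "joy",
    "closed_eyes": "reverence",
    "concert_poster": "nostalgia",
    "band_tshirt": "devotion",
    "bono_sunglasses": "devotion",
}

def tag_video(detected_cues):
    """Aggregate per-cue emotional labels into a ranked tag list for one video."""
    counts = Counter(
        CUE_EMOTIONS[cue] for cue in detected_cues if cue in CUE_EMOTIONS
    )
    return counts.most_common()

# Example: cues an object-recognition pass might report for one fan video.
print(tag_video(["smiling_face", "band_tshirt", "concert_poster", "band_tshirt"]))
# [('devotion', 2), ('joy', 1), ('nostalgia', 1)]
```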

Facebook has signaled that it wants to create A.I. that’s “more perceptive than people” in order to better serve content and ads based on visual cues from videos and photos shared on the platform. Twitter is already testing its similar Cortex technology, which aims to smartly identify objects in live Periscope feeds in order to better recommend live video streaming events.

The same concepts could certainly apply to the ongoing music streaming wars. Among the major music streaming platforms, it’s a race to develop the best predictive discovery software. Apple Music uses human curators to make playlists, compensating for the inconsistency of the algorithmic recommendations used by its competitors Spotify and Google.

Ultimately, technologists like Alphabet executive chairman and former Google CEO Eric Schmidt predict that consumers of the future will rely on computer software to serve them music discovery options.

Google Executive Chairman Eric Schmidt speaks at the American Enterprise Institute March 18, 2015 in Washington, DC. Schmidt took part in a discussion on 'The Disrupters: Technology and the Case for Optimism.'

Win McNamee/Getty Images

Spotify and Apple Music tend to make bold claims about their monthly subscriber numbers, but for now the king is YouTube, which, with more than a billion monthly users, dominates the music streaming landscape.

YouTube Red and the YouTube Music app do a good job of serving up new and different options for music discovery, but they’re dragged down by YouTube’s inability to actually identify what’s playing on the screen. Sure, Google knows which videos you gave a thumbs-up to, watched 50 times on repeat, shared on social media, and commented on, but it doesn’t have the visual cues to tell it why.

Audiences could also generate search results based on the emotional feeling behind songs and their performers. For example, a search today for “Bad Day” may bring up the single by Daniel Powter, but in the future users might be served that music video alongside artists whose videos simply evoke the feeling of having a bad day, not just that specific song. It’s also worth noting that YouTube’s search results are currently based largely on views, whereas likes, shares, and comments more directly affect the suggested artists on the desktop site and the playlist selection in the YouTube Music app.
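As a rough sketch of how an emotion-aware search might differ from today’s title-and-views ranking, the snippet below assumes a hypothetical catalog in which each video already carries an inferred emotion label alongside its view count. All titles, numbers, and labels are invented for illustration, not drawn from YouTube’s actual systems.

```python
# Hypothetical catalog: each video carries an engagement signal YouTube
# already tracks (views) plus an emotion label that future software might
# infer from visual cues. All entries and numbers are made up.
CATALOG = [
    {"title": "Bad Day - Daniel Powter", "views": 500_000_000, "emotion": "melancholy"},
    {"title": "Song For Someone - U2", "views": 20_000_000, "emotion": "longing"},
    {"title": "Everybody Hurts - R.E.M.", "views": 150_000_000, "emotion": "melancholy"},
    {"title": "Here Comes the Sun - The Beatles", "views": 300_000_000, "emotion": "hope"},
]

def emotion_search(query_emotion, catalog, limit=5):
    """Return videos whose inferred emotion matches the query, ranked by views."""
    matches = [video for video in catalog if video["emotion"] == query_emotion]
    return sorted(matches, key=lambda video: video["views"], reverse=True)[:limit]

# A listener searching for the feeling of "having a bad day" would see more
# than the literal title match.
for video in emotion_search("melancholy", CATALOG):
    print(video["title"])
# Bad Day - Daniel Powter
# Everybody Hurts - R.E.M.
```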

Pennington’s research could be a step toward not only identifying what’s on our screens but also why bands such as U2 evoke such an emotional response.
