Cats may have left theaters, but its memory will likely plague unfortunate viewers for years to come.
During the movie's several-month stint at the national box office, the stage musical turned CGI movie musical, which featured anthropomorphic cats singing and dancing in an eerily deserted downtown London, garnered a wide range of horrified reactions on social media -- from questionable feelings of bestiality to drug-fueled meetings with God. I admit that during my screening, in which at least 80 percent of the audience wore cat ears, I felt significantly less horrified than the internet at large -- but that's beside the point. As a whole, people were disturbed. But maybe that's a good thing.
Earlier this year, when the Cats trailer first dropped, Inverse explored what might make viewers squirm at this film full of human-fingered and -toed but fur-clad protagonists: the uncanny valley.
While the uncanny valley is often traced back to the 1970s and roboticist Masahiro Mori's work on human-like robots, the idea of the uncanny itself goes back even further, to German and Austrian psychologists Ernst Jentsch and Sigmund Freud in the early 1900s. Jentsch introduced the term in a 1906 essay entitled "On the Psychology of the Uncanny," and Freud elaborated on it just over a decade later in his simply titled essay "The Uncanny."
Despite the half century or more separating these introductions of the term, the psychological disturbance it describes remains remarkably consistent: while Freud and Jentsch focused on human-like dolls and waxworks, the uncanny as we typically know it today centers instead on human-like androids and CGI images. According to Mori, we are fine with robots that are either extremely unlifelike or that appear 100 percent human -- but robots or CGI images that fall in between plunge into the depths of the uncanny valley.
Karl MacDorman, an associate professor of human-computer interaction at Indiana University, tells Inverse that part of this unease stems from no longer understanding something we're intimately familiar with: other humans.
"Out of everything in our environment, we are most intimately familiar with other human beings. When something is not right about one of them, in a way that is beyond our expectations, we experience a feeling of uncanniness," says MacDorman. "The feeling had typically been rare before the development of human-looking robots or animated characters. These entities press beyond human norms far more often that we ourselves do. This phenomenon is known as the uncanny valley because human beings and somewhat human objects like toy robots give us a feeling of affinity that we don’t get from most other objects. Yet when we try to create objects that look human, we lose this feeling of affinity, and they can appear eerie."
As Freud might interpret it, the more aware we become of these entities' divergence from our expectations, the more we realize how thin the barrier is between what is "us" and what is "them" -- and, as a result, how close we are to not being "us" at any given moment. The feeling of uncanniness, then, is in many ways a defense mechanism to keep us firmly grounded in our sense of reality and personhood.
Unlike the thrill of a horror movie, people don't usually seek out the psychological disturbance caused by the uncanny valley. Roboticists and motion graphics designers aim to avoid it as much as possible in their work. But dulling our own built-in reality alarm system might not be such a good idea after all, especially as we plunge head-first into the world of synthetic and manipulated media in the form of deep fakes.
"If anything, unreality has been democratized. You don’t need the resources of a Joseph Stalin to enhance faces in photographs and to remove or replace unwanted people."
"Traditional forms of propaganda can have extremely negative effects on humanity, as the world experienced from 1933 to 1945. Technology only amplifies these effects," says MacDorman. "The value society places on authenticity may also be in decline. Thirty years ago the discovery that the group Milli Vanilli were lip-synching created a furor, but with autotune and the comping of vocals, little of what we hear today in pop music could be said to be authentic. If anything, unreality has been democratized. You don’t need the resources of a Joseph Stalin to enhance faces in photographs and to remove or replace unwanted people."
While what constitutes "authenticity" in music is constantly up for debate, one thing is certain: it can be harder and harder to discern reality from fiction, and the increased sophistication of deep fakes isn't helping.
As Siwei Lyu, director of the Computer Vision and Machine Learning Lab at the University at Albany, tells Inverse, the number of ultrarealistic deep fakes is increasing every day, and for the untrained eye it's getting harder to tell a good fake from the real deal.
"The quality of deep fake videos is increasing by a lot," says Lyu. "One year ago, two years ago, we had a bunch of deep fake videos on the internet of low quality. Back then, it was not very good. But we're seeing a steady improvement in quality."
Lyu tells Inverse that "bad" deep fakes, or poorly constructed synthetic images, can still evoke this feeling of the uncanny in their viewers thanks to visual artifacts left behind by the manipulation process -- such as a lack of blinking, blurry faces, or mismatched features. Good deep fakes, however, have fine-tuned the process so well that these imperfections are nearly imperceptible without the trained inspection of an expert. What role this kind of mass deception plays in our ability to maintain our sense of reality is still largely unexplored, says Lyu.
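The lack-of-blinking artifact, at least, can be checked mechanically. Here is a minimal sketch of how such a check might work, using the eye-aspect-ratio (EAR) heuristic -- a standard blink-detection measure, not necessarily the exact method Lyu's team uses. It assumes per-frame eye landmarks have already been extracted by some face-landmark detector:

```python
from math import dist

def eye_aspect_ratio(eye):
    """Eye Aspect Ratio (EAR): ratio of vertical to horizontal
    eye-landmark distances. It drops sharply when the eye closes.
    `eye` is six (x, y) points ordered: left corner, two upper-lid
    points, right corner, two lower-lid points."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = dist(p2, p6) + dist(p3, p5)
    horizontal = 2 * dist(p1, p4)
    return vertical / horizontal

def blink_count(ear_series, threshold=0.2):
    """Count runs of frames where EAR falls below the blink
    threshold. A talking face that never blinks over many seconds
    was a red flag for early-generation deep fakes."""
    blinks, below = 0, False
    for ear in ear_series:
        if ear < threshold and not below:
            blinks += 1
            below = True
        elif ear >= threshold:
            below = False
    return blinks
```

In a real pipeline the landmark points would come from a face-landmark library (dlib and MediaPipe are common choices), and the 0.2 threshold here is an illustrative value that would need tuning per detector.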
"I don't think it's necessary or possible to eliminate fake videos."
But before we throw up our hands and surrender to living in a world devoid of objective meaning, Lyu says there are ways we can still maintain our grasp on reality -- even if the uncanny valley has checked out.
"My personal view is that regulation will never work," says Lyu. "We can put up rules that say 'stop making these fake videos,' but people will get smarter. I think the best protection for us is actually to get the user and everybody aware of this problem. My analogy to this is basically like a flu epidemic. Deep fakes are kind of the epidemic of the internet. If everyone just watched a video and realized that this interesting video may potentially be a fake one, they make check the authenticity, they make check the source... I think that's the best protection. I don't think it's necessary or possible to eliminate fake videos."
Just as covering your mouth and washing your hands can help prevent the spread of disease, Lyu says that taking preventative and smart measures when it comes to consuming synthetic media can help prevent these fake videos from being more of an epidemic than they might already be. In this way, we can make up for our snoozing sense of the uncanny by practicing technology-forward common sense.
So, the next time you cozy up in front of a home screening of The Polar Express or Cats with a sense of unease, know that it's just your body trying to keep you safe.