Local and national news outlets — not to mention parents and grandparents on Facebook — are freaking out over the “Momo Challenge,” but the most frightening thing about the flare-up, says one internet hoax expert, isn’t the creature with huge eyes and a creepy smile.
The “Momo Challenge” allegedly started with YouTube videos and WhatsApp messages in which anonymous strangers on the internet gave children a series of increasingly disturbing instructions, including telling them to hurt themselves.
The instructions are allegedly accompanied by gory images, with the threat that the recipient will see more and worse images if they fail to obey, all behind an avatar of a creepy bird lady with huge eyes. It’s reminiscent of the Blue Whale Challenge, which resulted in at least one suicide in 2017. But in this latest case, there’s a host of conflicting information circulating about the Momo Challenge.
While YouTube and WhatsApp have issued statements saying they care deeply about the safety of their users and are doing everything they can to keep them safe, both companies have also said they have no evidence of any children being harassed by Momo.
It’s not clear whether the phenomenon is actually happening or just receiving a signal boost from media coverage. Some prominent news outlets have warned that the Momo Challenge poses a serious threat to kids everywhere, while others are dismissing the hubbub, decrying it as a hoax.
But here’s the thing no one’s saying about the Momo Challenge: It doesn’t matter whether it’s “real” or not.
Brooke Binkowski, managing editor at the fact-checking site Truth or Fiction and former managing editor of Snopes, tells Inverse that she’s usually quick to call out hoaxes and urban myths.
But Binkowski doesn’t think Momo is a hoax, exactly. Even if there’s nobody using the seriously creepy face to abuse children, the Momo Challenge news sheds light on something very real.
“Is Momo dangerous? I wouldn’t say so, but I wouldn’t call it a hoax, because there are truly creepy issues lurking in YouTube channels that are supposed to be vetted for children,” says Binkowski.
“Perhaps Momo is the face of the parental fears that are finally beginning to emerge as we all start to realize how corrosive some of the disinformation catering to all ages and creeds on the internet truly can be to a healthy society.”
Even if people seeking to harm children aren’t hiding behind the Momo avatar — which is, by the way, just a photograph of Keisuke Aisawa’s sculpture called “Mother-Bird” — there are well-documented instances of people inserting disturbing content into YouTube videos specifically aimed at children, including videos of cartoon characters engaged in acts of violence and giving instructions on how to hurt themselves.
In this way, Momo serves as the perfect mascot for a lack of proper content moderation on social media, as well as its consequences, all coming to a boil. The worst of it, Binkowski suggests, may actually be getting ignored among the Momo talk.
On Thursday, BuzzFeed News reporter Jane Lytvynenko wrote that there haven’t been any verifiable cases of self-harm or suicide linked to Momo. Nonetheless, this episode of internet weirdness is the latest symbol of something dark that has been bubbling below the surface for a long time, something arguably baked into the substance of platforms like YouTube: people using a poorly moderated platform to expose children to harmful content.
On Sunday, The Washington Post’s Lindsey Bever reported on kids’ YouTube videos that have literal suicide instructions spliced into them. These animated videos — nightmarish in their loud colors and terrible production value, but more or less harmless on their own — are interrupted by a spliced-in clip of YouTube character Filthy Frank telling kids how to do the most harm when cutting their wrists.
This trend, Binkowski says, is “the real monster.” After users reported the videos, YouTube started taking them down, but this case wasn’t the only dark trend seeping through the cracks in YouTube’s moderation.
Meanwhile, YouTube has also begun cracking down on pedophiles who used YouTube to trade photos of children. As The Guardian’s Alex Hern reported on Thursday, the site has disabled comments on videos of children in an effort to get out ahead of the trend, in which people provide timestamps on videos where children are exposed in various ways — for instance, when their underwear is visible.
So even as news outlets continue arguing over whether or not Momo is a hoax, Binkowski argues that these other cases of poorly moderated, abusive content targeting children on YouTube highlight the fact that Momo is simply giving a face to parents’ fears.
“I think it’s functioning really well as a motif or a symbol for the horrors that lurk just below the surface of the internets,” she says.
Furthermore, the videos with suicide instructions — which are verifiably real — represent YouTube’s failure to properly moderate the content on its massive platform that is supposed to be appropriate for children. And when harmful content is spliced into the middle of videos, it can easily fly under parents’ radar, too.
“All you need to do is have five minutes of cartoon characters looking legitimate, and then the parents will go do their thing and suddenly the kids are watching the right way to slit their wrists,” says Binkowski.
Among all the confusion and controversy, YouTube’s responses ring somewhat hollow. Instead of addressing the systemic problems that make it so easy for people to abuse the platform, the company has zeroed in on the specifics of each case. Worse, it doesn’t even seem willing to admit that the problems exist.
From The Atlantic’s article on the Momo Challenge:
YouTube confirmed that, contrary to press reports, it hasn’t seen any evidence of videos showing or promoting the “Momo challenge” on its platform. If the videos did exist, a spokesperson for YouTube said, they would be removed instantly for violating the platform’s policies. Additionally, there have been zero corroborated reports of any child ever taking his or her own life after participating in this phony challenge.
BuzzFeed News has identified multiple instances of Momo videos, though, making this response seem disingenuous at best. Whether or not the Momo Challenge is real matters less than how YouTube is handling cases of abuse on its platform. Binkowski argues that proper moderation could have helped YouTube head off this type of abuse before it reached its current state: so widespread and pervasive that it’s nearly impossible to police.
“What I think unites all of these scary videos is not that there is one single person or entity behind them because I don’t think that’s the case,” she says. “But what does unite them is the fact that all of this is totally predictable and could have been prevented years ago with appropriate moderation, not outsourced to untrained, unprepared people for pennies on the dollar.”
So is the Momo Challenge real? Who cares. Sure, Momo is creepy, but perhaps the scariest part of this viral phenomenon is how it’s distracting from the real harm that is happening across the internet.