In a bid to appear "edgy" online, there is always a contingent of trolls trying to shock and stun social media users with increasingly egregious content. It's nothing new. The posts, tweets, photos, and videos frequently mock minority groups while trivializing historical events that were, without doubt, heinous. So it's not entirely shocking that TikTok has its own platform trolls who, according to the BBC, make anti-Semitic content for the masses. The development comes a little over a week after trolls hijacked a Pride event held by TikTok.
But what's concerning is that the company continues to lag in enforcing the rules and mechanisms that could catch such content before it goes viral. In this case, anti-Semitic videos attracted at least six million views. How does a massive company with the resources to staff a full moderation department still stumble on something as rudimentary as hate speech?
Death camp humor — According to the BBC, TikTok has removed videos featuring deeply anti-Semitic content. One depicted a mechanical scorpion emblazoned with a swastika killing multiple people. An anti-Semitic song accrued six million hits. Another clip cheered on the Nazi death camp at Auschwitz, Poland. Yet another depicted a shooter game in which green gas canisters are used to kill people. Jokes about Roblox characters in Nazi uniforms also pepper the platform.
Viral in no time — In less than three days, the videos racked up likes en masse while other users set their own clips to pro-Nazi songs. It took TikTok eight hours to remove the content from its platform.
A company spokeswoman told the BBC, "We do not tolerate any content that includes hate speech, and the sound in question, along with all associated videos, have now been removed. While we will not catch every instance of inappropriate content, we are continuously improving our technologies and policies to ensure TikTok remains a safe place for positive creative expression."
Rot in the TikTok algorithm — Transparency around the company's algorithm would be a start. TikTok, like most tech companies, keeps its algorithmic strategy under tight wraps. After all, public scrutiny seems to be the last thing it wants.
Of course, the platform's algorithm likely boosts content that draws heavy engagement — even when that content is stunningly racist and pro-genocide. What users consume is thus shaped by slippery metrics like virality and popularity, and the content eventually reaches young, naive audiences — a demographic hooked on TikTok. Filtering and moderation, meanwhile, remain riddled with problems.
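To see why engagement-only ranking fails, here's a purely illustrative toy sketch (not TikTok's actual system, whose internals are secret): a ranker that scores videos solely on interactions per view will surface whatever provokes the strongest reaction, with no regard for what the content actually is.

```python
def engagement_score(video):
    """Rank by raw interactions per view -- entirely content-blind."""
    interactions = video["likes"] + video["shares"] + video["comments"]
    return interactions / max(video["views"], 1)

# Hypothetical videos with equal reach but very different reactions.
videos = [
    {"id": "benign_clip", "views": 10_000, "likes": 400, "shares": 20, "comments": 80},
    {"id": "hateful_clip", "views": 10_000, "likes": 900, "shares": 300, "comments": 600},
]

ranked = sorted(videos, key=engagement_score, reverse=True)

# The hateful clip ranks first because nothing in the score inspects
# the content itself -- only how much reaction it provokes.
print([v["id"] for v in ranked])  # → ['hateful_clip', 'benign_clip']
```

The point of the sketch: outrage generates interactions, so any score built purely from interaction counts rewards outrage, which is why moderation signals have to be injected into ranking rather than applied only after a video has already spread.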
These are just some of the problems plaguing the viral short-video company. If TikTok is as invested in racial equity and user safety as its press team constantly claims, it should start with a sincere overhaul of its content promotion system.