
TikTok’s new anti-bullying tools just mean more work for creators

The short video platform has a problem with bullying in comments. Now it’s adding new tools to try to combat it.


From YouTube to Facebook to the threads under almost any news article or blog post, comment sections can be brutal. TikTok, the short video platform owned by ByteDance, is not immune to this problem. On Wednesday, the company announced its efforts to bolster comment quality on the platform by giving users more control over what goes up and what doesn't in their comment sections.

"At TikTok, we continually work to maintain a supportive environment that enables our community to focus on what matters to them: being creative, finding community, and having fun," TikTok's policy director Tara Wadhwa said. "Part of this fun is engaging with content, sharing ideas, and connecting through comments."

It comes down to two new features, both of which involve getting content creators and viewers to be more mindful on the network. And both are fundamentally flawed.

Giving the creator more work — One of the new features lets TikTok content creators weigh in on every comment under their videos. This option is called "Filter All Comments" and doesn't allow comments to show up under a video unless the creator approves them through the comment management feature on the app.

TikTok already has spam and offensive comment filters, as well as a filter for specific keywords. But this is both more sweeping and a bigger ask of users.


The problem here, of course, is that it means more work for creators, who still have to read through hateful comments even if they never approve them. Turning all comments off, however, could hurt engagement and their popularity on the platform, so simply ignoring them isn’t really an option.

A chance to reassess your words — TikTok doesn't want to put the onus of healthy engagement on content creators alone, so its second feature focuses on the audience. Through this tool, TikTok users will be asked to "reconsider" their comments if the wording is unkind, inappropriate, or potentially in violation of TikTok's community guidelines. "If you change your mind," the prompt will read, "you can edit your comment." Twitter has tried a similar mechanism to encourage self-censorship, with mixed success.

TikTok’s track record isn’t great — TikTok’s new emphasis on anti-bullying stands in stark contrast to a report from The Intercept, which found that the company told moderators to curb the reach and promotion of content from "ugly," "poor," or "disabled" users, and from those filming against backgrounds that were "shabby and dilapidated," lest they tarnish TikTok’s image.

When the company itself has displayed a disparaging attitude towards whole categories of users, its pledge to fight bullying feels like posturing. Especially when, instead of real moderation, it places the responsibility for filtering objectionable material on creators themselves and only half-heartedly tries to discourage vitriolic comments, a move that could result in more harassment, not less.