TikTok’s new expert “content advisory council” hopes to rewrite the app’s problematic moderation processes

Is TikTok making progress here or just saving face?


TikTok, the problematic favorite app of millennials and Gen Z, announced today the creation of a “content advisory council” to guide its moderation practices. The council will be composed of technology and safety experts from “a diverse array of backgrounds and perspectives” who will develop content policies for TikTok’s future.

The company’s press release quotes various lawmakers and future TikTok content advisory council members in praising the initiative. Everyone involved seems to agree that the creation of the council displays TikTok’s dedication to keeping its platform safe and fun for all users.

It’s commendable that TikTok is putting resources toward honing its moderation policies for the future. But that alone isn’t enough to rectify the quietly discriminatory moderation practices the company has used in the past.

A long time in the making — Judging by earlier hints from TikTok and others, the content advisory council has been in the works for quite a while. TikTok mentioned the project in broad terms in October as part of its forward-looking moderation strategy, and it has since updated its community guidelines and begun building a transparency center.

The timing here is impeccable — Two days ago, The Intercept published leaked documents that point to some very concerning moderation practices at TikTok. The documents include instructions to suppress videos from the app’s “For You” page on the grounds that users are “chubby,” have “ugly facial looks,” or appear against scenery deemed “shabby and dilapidated.” The documents essentially reveal that TikTok prioritizes the aesthetics of its curb appeal over its users’ experiences.

TikTok has not yet responded to these serious allegations.

But this is still a good idea — Yes, this council was likely created at least partly to save face in the midst of these very unsavory allegations. That said, TikTok’s rapid rise has come with obvious struggles to manage its growing user base. Forming an external moderation council — a relatively drastic measure compared to how other social networks handle the problem — could be just the solution TikTok needs.

TikTok is making an obvious effort to redirect its moderation practices toward keeping the app — as its press release puts it — “lighthearted, real, heartwarming, and truly fun.” But the company will still need to publicly confront its problematic past before users can really trust it to be a safe space.