Streamers on Twitch have been rallying across social media under the #TwitchDoBetter hashtag to demand better moderation tools. The platform, used predominantly for streaming video games, has long been a place where marginalized creators are targeted with hateful messages.
As an example, streamer Safire Glamour shared a recording on Twitter which shows users flooding their stream with comments that read “ALL THE GAYS SHALL DIE” in rapid succession.
Basic accountability — Twitch, owned by Amazon, has said it would work harder to enforce its rules against hate speech and harassment. The new rally coalesced around a 2018 tweet in which the company said, “Please watch us closely and hold us accountable.” Some streamers feel it has been too long without material results: they are left to do the moderation work themselves, and argue that Twitch shouldn’t take its normal 50 percent cut of their revenue so long as that’s the case.
“Many marginalized creators find it better to turn off the currently available Twitch tools and just have their own community moderate their spaces,” said one streamer speaking to Kotaku. “If we are doing all this additional work, why is Twitch taking such a large portion of the profit we generate?”
Amazon doesn’t break out Twitch’s revenue, but a report earlier this year said the service was expected to generate $500 to $600 million in revenue for 2020. Gaming could be a big revenue opportunity for Amazon, which is also trying its hand at a cloud gaming product. But like other platforms, Twitch didn’t spend much time in its early days thinking about how it would keep its community safe from hateful speech.
With billions of minutes streamed every year on Twitch, the company has tried using automated tools to combat harassment, but abusers find workarounds, like replacing letters in a word with symbols. Human moderation could address that, but hiring such workers is expensive and tech companies would prefer to invest in improving artificial intelligence filters.
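To illustrate the cat-and-mouse game described above, here is a minimal sketch of how an automated filter might normalize common character substitutions before matching a message against a blocklist. The substitution map, the `BLOCKLIST` placeholder, and the function names are illustrative assumptions, not Twitch's actual implementation — and real abusers quickly find substitutions any fixed map misses.

```python
import unicodedata

# Hypothetical mapping of common symbol/digit substitutions back to letters.
SUBSTITUTIONS = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a", "5": "s",
    "7": "t", "@": "a", "$": "s", "!": "i",
})

BLOCKLIST = {"slur"}  # placeholder term standing in for real blocked words


def normalize(text: str) -> str:
    # Decompose accented characters, drop combining marks,
    # lowercase, then map the substitutions above back to letters.
    text = unicodedata.normalize("NFKD", text)
    text = "".join(c for c in text if not unicodedata.combining(c))
    return text.lower().translate(SUBSTITUTIONS)


def is_flagged(message: str) -> bool:
    # Flag the message if any blocklisted term survives normalization.
    normalized = normalize(message)
    return any(term in normalized for term in BLOCKLIST)
```

Under this sketch, a message like `"5lur"` normalizes to `"slur"` and gets flagged, which is exactly the kind of workaround a naive exact-match filter would miss.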
Some of the changes #TwitchDoBetter campaigners are asking for would be relatively easy for Twitch to implement. Accounts are trivial to create, and Twitch could make the process harder for abusers by requiring new users to verify a phone number. It could also set a minimum account age for commenting, so banned users can’t immediately return on fresh accounts and continue posting hateful messages.
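The gating campaigners describe could be sketched roughly as follows. The `Account` class, the seven-day threshold, and the phone-verification bypass are all hypothetical choices made for illustration; they are not features Twitch has announced.

```python
from datetime import datetime, timedelta

# Hypothetical minimum age before a new account may chat.
MIN_ACCOUNT_AGE = timedelta(days=7)


class Account:
    """Illustrative stand-in for a platform user account."""

    def __init__(self, name: str, created_at: datetime,
                 verified_phone: bool = False):
        self.name = name
        self.created_at = created_at
        self.verified_phone = verified_phone


def can_chat(account: Account, now: datetime) -> bool:
    # A phone-verified account bypasses the age gate in this sketch;
    # otherwise the account must be older than the threshold, so a
    # freshly created ban-evasion account can't post immediately.
    if account.verified_phone:
        return True
    return now - account.created_at >= MIN_ACCOUNT_AGE
```

The design trade-off is visible even in this toy version: raising the age threshold slows ban evasion but also delays legitimate newcomers, which is why a verification path (here, a phone number) is usually offered alongside it.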
Diversity — Gaming has long been plagued by sexism and harassment, as the industry has historically directed its marketing at male audiences, and some men direct hateful slurs at women they feel threatened by. Women actually spend more money on gaming than men, but largely on mobile games, a category that isn’t prominent on streaming platforms. That won’t change, and more women won’t come into the fold, if they are constantly harassed. It’s in Twitch’s best interest, then, to try and foster a more diverse and inclusive platform.
Other companies, including Microsoft, Sony, and Nintendo, have launched campaigns in recent years to address harassment and make their respective platforms more welcoming.
“The answers are there and it seems like they can be done without breaking the bank,” the streamer Rek It, Raven! told Kotaku. “There needs to be conversation with people at Twitch to really involve people affected so we can come up with solid conclusions for change.”
It would cost money for Twitch to staff up human moderation, but the company should be expected to balance profit with safety.