Minecraft needs to invest in more hate moderation, ADL study finds
The Anti-Defamation League studied three months of anonymized data to analyze how Minecraft treats hate speech.
The Anti-Defamation League (ADL) is recommending that Minecraft invest in its content moderation efforts and in creating more robust community guidelines after the organization studied three months of anonymized chat data from the game. The study, which was conducted in collaboration with Take This, the Middlebury Institute of International Studies, and GamerSafer, focused specifically on how Minecraft deals with hate speech.
The researchers chose Minecraft not just because it’s popular (though it certainly is, with 141 million active players) but also because, as the ADL puts it, the “decentralized, player-run nature of Minecraft Java edition provides a novel opportunity to assess hate and harassment in gaming spaces.” And though the ADL’s findings are specific to Minecraft, the recommendations do resonate across most online spaces.
Just a sample — Minecraft is massive, with many, many servers in which players congregate, making it impossible to study the entirety of that universe. As such, researchers focused on three servers of varying sizes and audiences:
- Server 1 included about 20,000 players, mostly 14 to 18 years old, with strict rule enforcement and a player-to-moderator ratio of about 464:1. As the most active server, it accounted for about 94 percent of the study’s data.
- Server 2 had around 1,000 players, on average 15 to 20 years old, and only two moderators with little to no moderation enforcement.
- Server 3 had just 400 players, with an impressive 41:1 player-to-moderator ratio and extensive and active moderation. Most players here were at least 16 years old.
Using GamerSafer’s plugins, researchers were able to track 458 disciplinary actions against 374 different users. The researchers combined these formal reports with a textual analysis of chat logs to locate patterns and unreported instances of hate speech.
Bans work — The researchers found that, for the most part, temporary bans — one of the moderators’ main tools to keep players in check — are effective in curbing bad behavior. Temporary bans were used as a moderation technique in 46 percent of all studied instances.
In the three months of logs studied here, a whopping 40 percent of all formal reports were filed for “hacking” (using banned advantages); 16 percent were filed for harassment, another 10 percent were filed for hate, and 9 percent were filed for sexual misconduct. Of the 1,463,891 messages studied, 2 percent were categorized as severely toxic; 1.6 percent were categorized as sexually explicit; and 0.5 percent were categorized as hateful (many of which targeted sexuality and gender).
Time to invest — The ADL’s conclusions are succinct: human moderation works, and Minecraft should invest more resources in it. Servers with more moderators and stricter guidelines turned up the fewest incidents of hate and harassment. The ADL also recommends that Minecraft improve researcher access to data so moderation efficacy can be studied in greater depth.
On a broader level, the ADL is recommending that the gaming industry as a whole standardize its moderation reporting practices. Doing so would allow for more significant research into how hate speech and harassment can be minimized in gaming spaces.