Ahead of its participation in today’s Senate hearing on child safety, YouTube revised its children-focused monetization policies to specify that only high-quality content will be allowed to make money on the platform. Only channels that specifically target younger audiences or are classified as “made for kids” will need to meet the new standards.
YouTube says it has reached out to creators who may be impacted by this policy change to help them adjust their content before the new rules take effect. Kids' channels that produce “low-quality” content could lose their place in the YouTube Partner Program (YPP) and may be demoted in search results.
In announcing these changes, YouTube makes it clear that these new policies are being implemented with kids’ well-being in mind. In fact, the company has gone so far as to have education experts comment on YouTube’s policy changes, presented as enormous, colorful doodles. There’s a definite push here for this to be seen as totally altruistic — but YouTube has its own interests in mind here, too.
What constitutes low-quality? — YouTube’s new guidelines for kids' content aren’t quite as cut-and-dried as most content creation rules. Rather than setting concrete definitions of what kids' content is or isn’t allowed to make money, YouTube has opted for some seriously vague terms here. “Low quality” and “high quality” are terms usually reserved for subjective value judgments.
In some sense, the judgment of a video’s quality will inevitably be subjective, as YouTube has the final say here. The company has provided some general guidelines for what counts as high-quality, though, including videos that promote “being a good person” and “interaction with real-world issues.” Overall, YouTube is stressing that kids’ content shouldn’t just be pushing products or encouraging bad behavior.
All about the kids…right? — YouTube is generally a bit sluggish in banning harmful content, so this proactive move is something of a novelty for the company. The motivation for this large-scale policy change can be found in the timing: YouTube has been swept into a much broader conversation about how internet companies often fail to protect their youngest users.
This isn’t a new conversation, but it’s one that’s escalated rapidly in recent months. The planned creation of an Instagram Kids app brought new attention to the question of when, exactly, companies should be allowed to use the internet to prey on children. Facebook’s inability to answer basic questions about how it protects children magnified the contentious issue to an unprecedented scale.
YouTube is facing similar pressure from lawmakers to tighten its policies to better protect children. The company has already been doing so for some time, but most of those efforts have focused on restricting access to parts of YouTube’s catalog or providing parental controls. Threatening demonetization is the biggest move YouTube has aimed at kids' content creators thus far.
The new policy’s vague quality judgments could make it difficult for creators to navigate what, exactly, they’re allowed to upload. But hey, as long as these vague terms get Congress off its back, right?