The iPhone's adult content filter is receiving criticism and scrutiny after reports from The Independent and Gizmodo that the filter blocks the term "Asian" no matter the context, while not extending the same over-reaching approach to other terms. The filter has no problem with terms like "Black," "White," "schoolgirl," "Japanese," "Korean," "teen," and, well, you get the idea. The issue was first spotted by iOS developer Steven Shen, who reported the problem to Apple in December 2019.
Apple's "Limit Adult Websites" filter, which forms part of the iOS Screen Time settings menu, is intended to keep younger iPhone users from accessing adult material online, whether deliberately or inadvertently. It's not meant to erase an entire continent and its innumerable sub-cultures.
Bad no matter who's to blame — The specific targeting of the term "Asian" is problematic whether it's deliberate or accidental, and whether it stems from a machine learning error or a human one. After all, "Asian" could appear in searches for Asian food, Asian literature, Asian sports, Asian philosophy, Asian restaurants, or other totally wholesome content. When a user searches for these terms, they get an error message that reads either, "You cannot browse this page at 'google.com' because it is restricted," or, "The URL was blocked by a content filter."
Here's what might have happened — One theory is that Apple's filter system is driven by artificial intelligence, and as we've learned in cases from facial recognition to job applications, AI is far from perfect and prone to overcorrecting. So this could be a mistake resulting from scraping heaps of search results that included "Asian" somewhere in them. If that's the case, it's still a problem, and one that should have been spotted, flagged, and rectified.
But if this filter application was the decision of a human being, well, that's even worse. Whatever the reason, Apple needs to seriously scrutinize its content filtering methods.