Meta exec to a tired, hurt society: Stop hitting yourself
“If we took every single dollar and human that we had, it wouldn’t eliminate people seeing speech that they didn’t like on the platform.”
Andrew “Boz” Bosworth is gaslighting you. Speaking with Axios on HBO over the weekend, the VP of Augmented and Virtual Reality at Meta (née Facebook) doubled down on his company’s line that it bears zero responsibility for the spread of dangerous misinformation and the demonstrably deadly effects of its social media platforms. Instead, Bosworth argues that the blame for COVID-19 denialism, vaccine skepticism, and corrosive reactionary propaganda lies wholly with us. What’s more, maybe we should take a good long look at ourselves before casting aspersions toward our benevolent techno-utopia providers at Meta.
“If your democracy can’t tolerate the speech of people, I’m not sure what kind of democracy it is,” Bosworth told Axios’ chief technology correspondent, Ina Fried, with a straight face.
Instead of admitting an iota of unintended consequences, Bosworth — and, by extension, Meta as a whole — continues to sidestep these concerns, focusing on the right to “free speech” while ignoring the painfully simple, obvious correlation between said speech and violent action. All this, even after it was revealed that Facebook’s internal algorithms often amplify the most incendiary posts.
Divorced from any sense of reality — Bosworth’s comments and rationale are almost surreal to hear at this point. No critic is arguing against the right to free speech here; what we’re concerned with is the way certain speech is amplified, even encouraged, on platforms like Facebook, and how that has real-world consequences. “I understand the speech of people can be dangerous, I really do,” Bosworth offers at one point, “But that is what we are talking about — a fundamentally democratic technology.”
... Except it isn’t that. Not in the slightest. You can’t have a “fundamentally democratic” social media platform if its users’ feeds are constantly customized, augmented, and reinforced using heavily flawed AI algorithms.
A ‘What Can You Do?’ attitude — One of the interview’s most infuriating moments comes when Bosworth offers the ridiculous logic that, because the company can’t solve 100 percent of these issues, there really isn’t much point in complaining about them in the first place. “If we took every single dollar and human that we had, it wouldn’t eliminate people seeing speech that they didn’t like on the platform,” he says, again conflating people getting offended with people dying after refusing COVID vaccines because some post told them the shots contained metal shards. “It wouldn’t eliminate every opportunity somebody had to use the platform maliciously.”
The bottom line is that, by now, it is absolutely clear Meta has no intention of enacting meaningful reforms unless explicitly compelled to by federal oversight. Until and unless that happens, expect its platforms’ damaging fallout to continue reverberating throughout society. What’s more, you can expect Meta to essentially keep slapping you in the face with your own hand while repeating, “Stop hitting yourself.”