Facebook’s 7 "Hard Questions" are Actually "Easy Questions"

Zuckerberg wants your input on his list of moral dilemmas. 

Getty Images / Paul Marotta

Facebook, the internet’s best place for expressing bad opinions, decided on Tuesday that it should reach out to its community of users for help answering some of the most difficult dilemmas it faces, like the question of stamping out fake news and stopping actual terrorists. Most of its seven “hard questions” are actually pretty easy, though.

On Tuesday, Facebook’s Vice President for Public Policy and Communications, Elliot Schrage, published a blog post called “Hard Questions,” as Facebook tries to codify its relationship with the ugly side of human life that’s always present on its network.

Screenshot of an ISIS-affiliated Twitter account. 


1. How should platforms approach keeping terrorists from spreading propaganda online?

This is something that’s already been tackled by other platforms. Twitter has done this surprisingly well in the case of ISIS (although it fails at dealing with everyday abuse). In 2016, a West Point study found that the Islamic State’s media presence had significantly declined on most social media channels, although it flourished on encrypted Telegram.

There is a grey area when it comes to politically controversial groups like Hamas or various Kurdish groups in Turkey, which are classified as terrorist organizations by some governments but also hold political office.

Facebook, for its part, has now somewhat addressed this issue, getting into the technical details in a separate blog post published Thursday.

The Facebook page for Dee Gyp Blancharde, who was killed by her daughter, Gypsy Blancharde.


2. After a person dies, what should happen to their online identity?

This also seems to be a pretty easy fix, and Facebook is already on the right track. The service should have users name a “caretaker,” to whom access to the account passes in a limited form after someone has died. Again, this can get complicated in extreme cases, like the above profile for Dee Dee Blancharde, a woman who forced her daughter to fake illnesses for nearly her entire life, until the daughter, Gypsy, murdered her. Gypsy accessed her mother’s Facebook page (Dee Dee’s photos were of Gypsy) and posted “That Bitch is dead!” after murdering her. It’s unclear why Dee Dee’s profile still exists in its current state.

3. How aggressively should social media companies monitor and remove controversial posts and images from their platforms? Who gets to decide what’s controversial, especially in a global community with a multitude of cultural norms?

This question hinges on the definition of “controversial” — while much of society can agree that stigmatizing the female nipple to the point where breast cancer organizations can’t share awareness advertising is counterproductive, “controversial” means different things to different cultures and people. One solution could be to put content filtering back in the hands of the user, offering a set of “parental controls,” if you will.

4. Who gets to define what’s false news — and what’s simply controversial political speech?

Facebook will probably waffle over this constantly. But the company should just ban or flag websites that are known to host fake news articles. “Controversial political speech” doesn’t really have anything to do with stories that are demonstrably false.

IRL protests often use online hashtags. 

Getty Images / Jack Taylor

5. Is social media good for democracy?

The blog post on this one is going to be interesting. Facebook is going to say “yes, the product that we provide is great for an egalitarian political system!” but the real answer is probably something different. Social media, in theory, is probably great for democracy. The problem is, of course, that Facebook’s motivation isn’t to promote democracy, it’s to make money — despite Mark Zuckerberg’s unofficial presidential campaign.

6. How can we use data for everyone’s benefit, without undermining people’s trust?

The answer here is also relatively simple: tell people what data you’re using and why.

7. How should young internet users be introduced to new ways to express themselves in a safe environment?

Parental controls, maybe, could be another help here, giving young users a slightly more restrictive or limited Facebook experience. With the amount of data and control Facebook has, it shouldn’t be hard to find a way to minimize the potential damage to young social media users.
