On August 31, a Mississippi man named Ronnie McNutt reportedly died by suicide, shooting himself while streaming on Facebook Live and forcing friends and family to be unwitting witnesses to an unspeakable, violent tragedy. It should not have been possible, and yet it was — and by the nature of social media, the horrific footage has snowballed across the internet to reach just about every major platform.
Outside of cesspools like Hoodsite and 4chan, the video has since made its way from Facebook to Twitter. And TikTok. And Instagram. Even Snapchat stories. Users on each site have posted warnings urging others to report the video if it crosses their paths and not, under any circumstances, to watch it. Of course, the curiosity evoked by posts of that kind has gotten the better of some, much to their regret, while others have stumbled upon it entirely by accident.
The consequence of that is trauma — real, genuine, difficult-to-shake trauma. Were this the first such incident, it might be excusable that Facebook didn’t catch this video right away and do everything in its power to prevent its circulation. But it’s not; it’s far from that.
You’ve had enough chances — Input has reached out to the aforementioned platforms for comment on their responses to the video. Social media sites have been trying for years to get violent content under control, especially since the introduction of live features that are harder to monitor in real time. Facebook Live has seen numerous suicides and murders since it first rolled out, and the company has been working on algorithms and reporting methods to effectively screen for this type of content since at least 2017. Somehow, though, Facebook still hasn’t figured it out.
On TikTok, where auto-playing videos are everything, something like this is much harder to avoid once it starts gaining traction.
Some users have noted that they saw the video itself on their feeds early on, while others are now reporting that the platform is filled with warnings about it. TikTok appears to be on top of the issue. In a statement to Input on Monday, a TikTok spokesperson said:
On Sunday night, clips of a suicide that had been livestreamed on Facebook circulated on other platforms, including TikTok. Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide. We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who've reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family. If anyone in our community is struggling with thoughts of suicide or concerned about someone who is, we encourage them to seek support, and we provide access to hotlines directly from our app and in our Safety Center.
This is not entertainment, it’s real life — The internet has a way of both deepening our connections to others and of stripping away our sensitivities. Violence becomes throwaway content in the neverending flow of material, there one moment and forgotten in the next.
But these are real lives, not just some Netflix drama you can consume and obsess over and then move on from. It is not our place to pick apart the motivations of a person — a stranger, to most of us — who felt enough pain that it led them to take their own life, however public that act may have been. There is no justice to be brought to the grieving family by keeping such videos afloat, only more suffering.
It is entirely within our control not to be voyeurs to this type of horror, and to stand firm against the traumatization and re-traumatization of people already in mourning.
If you or someone you love is struggling with thoughts of suicide or self-harm, there is help available. The National Suicide Prevention Lifeline operates 24/7 and can be reached at 1-800-273-8255. You can also text Crisis Text Line at 741741.