A video that shows a man shooting himself with a gun began circulating on TikTok this weekend, and the company has been racing to keep it from spreading further. But some users say the disturbing clip is still popping up in their feeds, sparking concerns about the efficacy of TikTok’s moderation systems.
A TikTok spokesperson confirmed to Forbes that “clips of a suicide” had been showing up on the platform Sunday evening after the video was originally livestreamed on Facebook. (Note: We’ve reached out to Facebook to confirm this, and will update with their response). Several TikTok users began sending out signal boosts on Sunday and Monday warning their followers about how the clip begins—with a man with long hair and a beard talking on the phone—so that they can swipe away before it turns gruesome. As The Verge notes, some users claim they’ve come across the clip hidden inside completely unrelated TikToks as well.
The video violates TikTok’s guidelines, but the company’s been scrambling to contain the situation, likely because TikTok, unlike other social media platforms such as Facebook and Twitter, relies heavily on its algorithmically generated “For You” discovery page to fuel engagement, which makes it harder to keep users from unwittingly stumbling across any single video.
“Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide,” a TikTok spokesperson told the Verge. “We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.”
“If anyone in our community is struggling with thoughts of suicide or concerned about someone who is, we encourage them to seek support, and we provide access to hotlines directly from our app and in our Safety Center,” the spokesperson added.
While several social media platforms have had their woefully inadequate moderation practices called into question in recent weeks, TikTok’s oversight in this case is especially egregious because nearly the exact same thing has happened before. In February, a 19-year-old livestreamed his suicide on the platform, and TikTok reportedly waited nearly three hours after learning about the incident before contacting the authorities, according to internal documents viewed by The Intercept.
That’s horrifying enough on its own, and even more so when you remember that roughly a third of TikTok’s 18 million users are purportedly under the age of 14. With President Donald Trump threatening to ban the app, several companies, from Microsoft to Twitter to Walmart, are currently vying for control of TikTok’s U.S. operations, so there’s enormous pressure on TikTok to get its shit together and, at the very, very least, find a way to keep from triggering children with violent imagery.
Finally, because I know this subject matter can be triggering in itself, please reach out using the resources below if you or anyone you know is considering suicide, struggling with anxiety or depression, or just needs to talk:
- The National Suicide Prevention Lifeline: 1-800-273-8255
- The Trevor Project: 1-866-488-7386
- The International Association for Suicide Prevention also maintains a directory of crisis centers around the world.