Facebook and YouTube can't stop a girl's suicide video from spreading

Katelyn Nicole Davis, a 12-year-old girl, took her own life last month.

Two weeks later, the video is still resurfacing across the internet. Headlines from last week, still circulating this week, claim the girl "streamed her own suicide on Facebook" (Heavy.com) and "streams her own suicide on Facebook Live" (TheSun.com).

But it wasn't on Facebook Live. The more than 40-minute video of the girl hanging herself originated on Live.me, a mobile live-streaming app popular among teens. The video remained up on Live.me and was later uploaded and circulated on Facebook and YouTube, as well as embedded across other websites.

While it's difficult to determine the origin of the confusion, it's evident that Facebook is a powerful distributor of video, especially live moments. Facebook has been making it easier to create and distribute video, most recently by adding automatic captioning and bringing live video to desktop.

But clearly the company is neither moderating effectively nor being explicit about its guidelines. Even law enforcement officials contacted about the video have expressed frustration at being unable to get it taken down.

On Thursday, Mashable identified a video of the suicide that had been uploaded to Facebook four days earlier. The video carried a "graphic video" warning, which Facebook attaches to videos it believes meet its Community Standards but should be viewed with discretion.

Image: Facebook screenshot

Kyle MacDonald, a New Zealand-based psychotherapist, viewed a version of the video and reported it to Facebook. As he wrote in The Guardian, he received this response earlier this week: "We’ve reviewed the share you reported for showing someone injuring themselves and found that it doesn’t violate our Community Standards."

However, Facebook told Mashable Thursday that such a video should have been removed from the website.

Facebook isn't alone in its struggle to prevent videos that violate its standards from being uploaded and shared. Mashable found several videos on YouTube that included clips of Davis's final moments. A YouTube spokesperson said the site prohibits content that shows the moment of death in the suicide of a minor.

As Mashable previously reported, Facebook videos must be taken down by a human moderator; they are not removed automatically by software.

Facebook's guidelines require a human to place such "graphic video" warnings on user-uploaded videos, which means the warning label on the video above was added by a person. The video has since been removed.

Still, Facebook's system is not working. "We want it down as much as anyone for the family and it may be harmful to other kids. We contacted some of the sites. They asked if they had to take it down and by law they don’t. But it’s just the common decent thing to do in my opinion," Polk County Police Chief Kenny Dodd told Fox 5 Atlanta.

Facebook declined to say whether the company has been in contact with Dodd or other police officials. The company repeatedly told Mashable that it did not host the original video.

Live.me, the app that hosted the original stream, has reached out to Facebook and YouTube to help remove the content, a spokesperson confirmed to Mashable Thursday in an emailed statement.

Facebook declined to comment about the situation on the record.