WARNING: This story contains graphic details.
On Saturday afternoon, Hoda Awale was scrolling through her Twitter feed when she came across news of a shooting outside a mall in a Dallas suburb.
Almost immediately, she was subjected to graphic images of the bloodied victims, some of whom were children.
The video began playing automatically because of her account's autoplay setting.
"There was nothing to indicate that the video would be something as horrific as I saw," said Awale, who works for a public health nonprofit in Seattle.
"Especially since it was children, it really did traumatize me in that moment."
In all, eight people were killed and seven others were wounded in the parking lot of the outlet mall in Allen, Texas, before the gunman was shot dead by a police officer.
In an effort to prevent others from inadvertently seeing the video, Awale tweeted a screenshot of the first moments, warning of what was ahead.
Awale was among a number of Twitter users who criticized the platform for not immediately taking down the video or, at minimum, adding a warning notice. She, like many, has since disabled Twitter's video autoplay function — an automatic setting that predates the platform's ownership by Elon Musk.
As the video circulated on the platform, a debate played out over the limits of freedom of expression and the power of images to both cause harm and provoke change.
The power of video
In response to Awale, one Twitter user argued the images should be widely seen to force U.S. lawmakers to finally make changes to the country's gun control laws.
"The world needs to see what's happening here. Everyone should watch it," they wrote.
Another questioned the value of censoring the video: "Why? So we can continue to live in denial?"
There is a long history of disturbing photographs and videos sparking social change, such as the recording of the killing of George Floyd and the Black Lives Matter movement that followed, said Prof. Heidi Tworek, director of the Centre for the Study of Democratic Institutions at the University of British Columbia.
The question, she stressed, is whether "this video really falls into that category, or whether it becomes a gratuitous display of violence, whether it's disrespecting victims, whether it's potentially inspiring copycats."
James Turk, director of the Centre for Free Expression at Toronto Metropolitan University, pointed out that similar debates have been playing out in traditional news media for years and show no signs of abating.
"I don't think there's a right answer," he said.
Either way, the video appears to be a violation of Twitter's terms of service — which state that "excessively gory content" is not permitted. The company's rules also say media is prohibited if it has "the potential to normalize violence and cause distress to those who view them."
The video, published by multiple accounts and viewed an untold number of times, was still viewable on Twitter midday Monday. It was not easily searchable on other social platforms, such as Facebook or YouTube. (CBC News has decided not to air the video.)
Musk has described himself as a "free-speech absolutist," but it's difficult to say whether allowing such footage to circulate was deliberate, or perhaps the result of recent cuts to the company's content moderation team.
The company did not return a request for comment Monday. CBC News received a poop emoji as an automated email reply.
Facebook declined to comment, while a spokesperson for YouTube said it "quickly removed violative content" and is ensuring people are "connected with high-quality information when searching for details about this tragic incident."
Legislating online harm
Awale, who has been a Twitter user for more than a decade, said she was troubled that the video — which she described as "trauma porn" — had still not been taken down.
At the very least, she said, there should have been a content warning on the video so that people could make the decision for themselves.
In Canada, such considerations are likely to be part of forthcoming federal legislation regulating online harm on social media platforms.
Tworek was part of an expert group consulted on the issue in 2022. A proposed law, Bill C-36, died on the order paper that February. New legislation is expected in the coming months.
"This is not the first time we're having these kinds of discussions about images and videos," she said. "This is clearly a systemic problem about processes within platforms."
She added that, by now, even if removed from Twitter, the video will have been downloaded countless times and reposted elsewhere on the internet.
According to Turk, the legislation will be challenging for the government, given the evolving nature of online content and the differences of opinion over how it should be regulated.
"In my view, the only limits on what should be restricted by the state is that which is illegal," he said, referring to content such as hate speech or advocacy of violence.
"In terms of conveying graphic material about a shooting or about a war, I'm not sure it's appropriate to put a ban on it."