Facebook looking to limit amount of damaging disinformation during federal election


TORONTO — Facebook Inc. has unveiled its plans for the federal election to limit disinformation and targeted attacks that continue to be a problem on social media.

The tech giant said Wednesday that it has learned from past elections, including Canada's 2019 federal vote, and it has implemented new measures to secure the online accounts of candidates and is providing new training specifically to make its platforms safer for female candidates.

The company will also continue efforts to remove posts that violate community standards, including content that misleads people on how to vote.

"We will put all our efforts and energy into the next five weeks or so, to make sure we do all that we can to protect the platform from abuse and from bad actors," said Kevin Chan, Facebook's head of public policy in Canada.

The company has faced significant criticism in recent years for how the platform has allowed misinformation to spread online, including in recent U.S. presidential elections.

A report released in March by advocacy group Avaaz found that the company could have prevented billions of views of misinformation had it acted differently during the 2020 U.S. presidential election. The group has also been critical of how the company has allowed misinformation about COVID-19 to spread.

“I would say historically their track record is not great, and we have seen this south of the border," said Ramona Pringle, associate professor at Ryerson University's RTA School of Media.

Social media platforms are still ripe for misleading people because news from established sources is mixed in with less reliable sources, she said.

“Because everything is slurried together in this one space, it makes people more inclined to believe the misinformation, or misleading information, because it appears side-by-side with more legit information as well.”

Facebook, however, says it has been active in removing misinformation posts that violate its standards. In a separate report out Wednesday, the company noted that since the start of the COVID-19 pandemic it has removed more than 20 million pieces of content from Facebook and Instagram globally that violated its policies, and attached warnings to more than 190 million posts that may be false or misleading.

“Our job is really to maximize voice, to allow as many people as possible, as wide a berth as possible, to express themselves in the way they want to, but it’s not without limits," said Chan.

Misinformation has also not been as prevalent an issue in Canadian elections as in the United States. A 2020 report from the Digital Democracy Project found that while disinformation was shared during the 2019 election, it "generally did not appear co-ordinated and had limited impact."

Facebook users will also see fewer posts related to politics in general this election after the company instituted a policy in February to reduce the prominence of posts related to civic issues. The company said it implemented the policy based on requests from users.

Users can also choose to limit the number of ads in their news feed, and Facebook says it is providing more transparency and vetting over political ads.

The company says it is working with Agence France-Presse and Radio-Canada as fact checkers for posts in Canada, and has 35,000 people working on safety and security to help ensure the integrity of elections globally.

This report by The Canadian Press was first published Aug. 18, 2021.

Ian Bickis, The Canadian Press
