Child luring and exploitation through Snapchat is on the rise. Here's what you should look out for

Law enforcement and government officials are concerned about the increase in social media apps like Snapchat (the messaging app is pictured on a phone screen) being used to lure and sexually exploit teens.  (Thomas White/Reuters - image credit)

Law enforcement and government officials say they've seen an increase in social media platforms such as Snapchat being used to lure and sexually exploit children and youth since 2020.

They believe the rise is partly due to increased screen time during the COVID-19 pandemic, which shifted much of people's communication online.

In London, Ont., police have received numerous reports of adults using social media to inappropriately engage with minors, including the exchange of intimate photos and videos, and sexual solicitation.

"We're seeing exploitation in relation to children all across the age spectrum," said Det. Jeremy Dann of London police's Internet Child Exploitation (ICE) Unit. "Generally, it's on social media apps and chat rooms.

"Sextortion, or sexual-based extortion for profit, is becoming extremely common. Several times per week, we have reports coming in about a local child that's been victimized in this way," Dann said.

In December, David G. Nicol, 50, of London faced charges for a second time after allegedly posing as a 14-year-old boy to sextort teens. He was charged with more than 30 sexual assault offences dating back to 2018. A few weeks later, London police's ICE Unit arrested a 44-year-old man from Scarborough in suburban Toronto for allegedly luring teens under age 16.

Although the two cases are unrelated, Snapchat was allegedly used in both to communicate with young people.


In 2021, Dann's unit saw a 38 per cent jump in reported incidents of online enticement and sexual exploitation compared with the previous year. In 2022, the increase was 40 per cent, he said.

But Dann said this surge is not unique to London. His team works with ICE units across Canada, and all of them are seeing the same increase and the same patterns of exploitation.

In an emailed statement to CBC News, a spokesperson for Snapchat wrote, "Snap was intentionally designed to be a visual communications platform for communicating with your real friends — and our product design includes safeguards to make it harder for strangers to find and contact younger people.

"We routinely work with safety experts and law enforcement to help combat [exploitation]; we have also rolled out new in-app safety tools called Family Center, with the goal of giving parents more insight into who their teens are communicating with on Snapchat," the statement reads.


App features 'appealing' to kids

Submitted by C3P

"Many modern social media apps are designed to delete messages after they've been sent or received; they're designed to keep parents in the dark about what their child is truly up to," Dann said. "This feature poses a risk as it can hide problematic behaviours."

Catherine Tabak of the Winnipeg-based Canadian Centre for Child Protection (C3P) said platforms such as Instagram and Snapchat are created in a way that appeals to kids.

"With Snapchat, there's a false sense of security that information or pictures are being deleted and that there's no evidence of communication that's occurring between two people on that platform."

Tabak said adult offenders often connect with teens on one platform such as Instagram, and then move the conversation onto Snapchat.

Although online luring existed before the pandemic, its impact has snowballed into higher volumes of reported incidents, Tabak said. In 2022, C3P's national tipline, Cybertip.ca, received more than 800 reports, up from more than 600 in 2021.

But Tabak said these numbers are only the tip of the iceberg. The families who reach out to C3P often do so once the matter has escalated to a point where a child is seeking help, she said, adding that sometimes children communicate with offenders for several months.

Dann and Tabak gave these tips for parents and children:

What can parents do?

  • Monitor what your child is doing online.

  • Know who they're talking to and what that communication entails.

  • Don't let them talk to someone you don't personally know or trust.

  • Be familiar with the apps your child is using. If you don't know how an app works, don't let your child use it.

  • Don't let them use apps that hide or delete what they're doing.

  • Have the passwords for all devices and apps that require them.

  • Check children's devices frequently and unexpectedly.

  • Start conversations early on. If giving a child a device, talk about the potential threats. Tabak suggests using examples from the media: tell the child what you've read and get their insight on it.

  • Keep an open line of dialogue with your children. Dann said children often don't share these incidents with their parents due to shame and fear of punishment.

  • Ensure devices are used in an open area and have rules on times of the day your child is allowed to use them.

  • Cut off Wi-Fi at night, Tabak said, as most instances occur later in the evenings when parents are asleep.

  • Lead by example. Minimize using your own device at the dinner table, and teach children it's healthy to disconnect every now and then.

What should kids beware of?

  • Tabak said offenders often pretend to be someone from the child's community (such as a child from another school in the area, a friend of a friend, etc.).

  • If someone the child meets online tries to move communication from one platform to another (such as from Instagram onto Snapchat).

  • Comments that are sexual in nature, especially early on.

  • Pitting children against their parents, or anything said about the parents being too strict or not understanding.

  • Persistence (such as someone getting upset when the child doesn't respond within a certain timeframe, or when the child refuses to send photos or videos).

  • Coaching the child to delete messages that were exchanged, or threatening to harm themselves if the child doesn't continue talking to them.

  • Excessive compliments and flattery, promises of a better life, or claims that they're the only one who understands the child.

  • Don't talk to strangers. Tabak said kids should apply the same principles they're taught about in-person relationships to the online sphere.

'We need a laser focus': public safety minister

Isha Bhargava/CBC

On Jan. 16, during an announcement in London, federal Public Safety Minister Marco Mendicino told CBC News the problem of luring is exacerbated with predators using social media's unique features to their advantage.

"We really have to make sure that we are focused like a laser on the subject matter so that we can prevent that crime from occurring as much as possible and also help victims and survivors of that kind of exploitation on the path to recovery," Mendicino said.

Ottawa's comprehensive strategy to deal with online harm includes investments in organizations like C3P, and a national strategy addressing child and human trafficking, Mendicino said.

Tabak believes social media giants also have a responsibility to improve moderation: vetting who can create an account, and offering better tools to report suspicious activity that let people give more context rather than just flagging an account.

This also includes getting rid of features that delete messages, as well as "My Eyes Only," a password-protected section for snaps a person wants to keep private, Tabak said.

In the emailed statement to CBC, Snapchat said it uses technology to detect and combat abuse, and reports any known images and videos featuring child sexual exploitation to authorities, including C3P.

"We use machine learning-driven tools to help us identify keywords and account behaviours that suggest abusive accounts or other suspicious activity," the spokesperson said. "We use these signals to flag high-risk accounts for suspicious-activity review and are continuing to aggressively develop this capability."