
Snap Employees Privately Admitted Tens of Thousands of Minors Were Being Abused on the Platform

Snapchat app is seen on a smartphone (Dado Ruvic/Illustration/Reuters)

In one exchange, a Snap employee admits that the 10,000 reports of abuse flagged each month represent a ‘small fraction’ of the real number.


Employees of Snapchat’s parent company admitted they were not doing nearly enough to stamp out the harassment and abuse of minors on the social-media platform in a series of internal private messages released as part of a lawsuit filed by New Mexico attorney general Raúl Torrez.

New Mexico alleges Snapchat is a “breeding ground” for abuse and sextortion, in which adults coerce minors into sharing explicit images and then use them to blackmail the underage victims. These illicit activities are enabled, in part, by Snapchat’s feature that allows photos and videos to disappear after 24 hours.

The unredacted complaint, filed Tuesday in Santa Fe County, reveals a slate of internal documents and messages to back up its assertion that Snap ignored reports of sextortion and failed to implement an age-verification system, among other shortcomings. The lawsuit was initially filed on September 5.

In a November 2022 internal email, one Snap employee raised concerns regarding “10,000 user reports of sextortion each month” and the “psychological impact of sextortion” on minors victimized by predators. A second employee responded, saying the number of sextortion reports “likely represents a small fraction of this abuse as this is an embarrassing issue that is not easy to categorize in reporting.”

A month later, a Snap marketing brief focusing on sexting and sextortion admitted that the platform had a “deeply pernicious and dangerous” problem with adults targeting minors. In determining how best to warn underage users and parents of the risks associated with the platform, Snap employees wanted to be careful to avoid “striking fear” into them.

“We need to run through a very thoughtful messaging & visual storytelling exercise/session on how to best balance education without striking fear into Snapchatters,” reads the company’s 2022 internal marketing brief.

Snap even acknowledged that sextortion reports were falling “through the cracks” and that law enforcement was taking “no action” on reports of users “being sextorted or asked for nudes (which we know is often the start of sextortion),” according to internal chats. However, rather than implement safeguards to address the issue, Snap only complained that moderating sexually explicit content would “create disproportionate admin costs” and said that should not be its responsibility.

Notably, Snap employees said they were aware of an account that “had 75 different reports against it since Oct. ’21, mentioning nudes, minors, and extortion, yet the account was still active.”

Furthermore, Snapchat effectively doesn’t verify the ages of users to prevent abuse and sextortion, though it has said in the past that users must be 13 years or older to access the platform. In May 2022, a Snap executive emailed, “I don’t think we can say that we actually verify” users’ ages.

The email was sent in response to a Washington Post article about a teenage girl who was suing Snapchat at the time for failing to prevent her sexual exploitation.

“Snap says users must be 13 or older, but the app, like many other platforms, doesn’t use an age-verification system, so any child who knows how to type a fake birthday can create an account,” the Post‘s Drew Harwell wrote.

In 2022, the company was advised by an external consultant that sextortion is fairly common on Snapchat. This was corroborated by Snap’s own research that same year, which found that more than one-third of teen girls and 30 percent of teen boys received unwanted sexual advances on the app.

In fact, the app was prompting minors to connect with unknown adults through its Quick Add feature, one employee pointed out in an internal message. In at least one case, the Quick Add feature led to tragedy: In 2023, Alejandro Marquez pleaded guilty to raping and murdering an eleven-year-old New Mexico girl whom he found using Quick Add.

Moreover, in a survey a year later, more than half of Gen Z respondents said that they or a friend had been catfished into sharing personal information or sexually explicit images on the platform. Most of those respondents shared the solicited images or information, Snap found.

“We designed Snapchat as a place to communicate with a close circle of friends, with built-in safety guardrails, and have made deliberate design choices to make it difficult for strangers to discover minors on our service,” a Snap spokesperson said in response to National Review’s request for comment on the unredacted complaint.

“We continue to evolve our safety mechanisms and policies, from leveraging advanced technology to detect and block certain activity, to prohibiting friending from suspicious accounts, to working alongside law enforcement and government agencies, among so much more.”

When the lawsuit was filed last month, a Snap spokesperson said the company would respond to the New Mexico attorney general’s claims in court after carefully reviewing the allegations and maintained that it remains committed to “keeping young people safe online.”

Earlier this year, Snap CEO Evan Spiegel testified before Congress about how his company has been working to “proactively detect these bad actors on our service and seek to intervene before the conversation can escalate to extortion.” Spiegel boasted that, to prevent sextortion, his team typically acts “within 15 minutes” of first receiving user reports of harassment or sexual content. He disclosed that Snap reported 690,000 instances of child sexual abuse material in 2023, which led to 1,000 arrests.

“Today’s filing is further confirmation that Snapchat’s harmful design features create an environment that fosters sextortion, sexual abuse and unwanted contact from adults to minors,” Torrez said in a statement.

“It is disheartening to see that Snap employees have raised many red flags that have continued to be ignored by executives,” he added. “What is even more disturbing is that unredacted information shows that the addicting features on Snapchat were blatantly acknowledged and encouraged to remain active on the platform.”

The suit is the product of a months-long undercover investigation by the New Mexico Department of Justice, which found more than 10,000 dark-web records involving Snap and child sexual abuse material over the past year, one of the primary findings Torrez announced last month. The records included images and videos of minors under the age of 13 being sexually assaulted.

From their extensive investigation of various dark-web sites, New Mexico officials concluded that Snapchat is the largest source of such material.

New Mexico is also suing Meta for allegedly encouraging child sexual exploitation on its Facebook and Instagram platforms. Torrez made significant progress in the case after a state judge denied Meta’s motion to dismiss the lawsuit in May.

The Democratic attorney general intends to hold both Snap and Meta accountable for prioritizing their business growth over children’s safety.

David Zimmermann is a news writer for National Review. Originally from New Jersey, he is a graduate of Grove City College and currently writes from Washington, D.C. His writing has appeared in the Washington Examiner, the Western Journal, Upward News, and the College Fix.