Teen girl sexually exploited on Snapchat takes on American tech

File photo: REUTERS/Dado Ruvic

Published May 7, 2022

Washington – She was 12 when he started demanding nude photos, saying she was pretty, that he was her friend. She believed, because they had connected on Snapchat, that her photos and videos would disappear.

Now, at 16, she is leading a class action lawsuit against an app that has become a mainstay of American teen life – claiming its designers have done almost nothing to prevent the sexual exploitation of girls like her.

Her case against Snapchat reveals a haunting story of shame and abuse inside a video-messaging app that has for years flown under legislators’ radar, even as it has surpassed 300 million active users and built a reputation as a safe space for young people to trade their most intimate images and thoughts.

But it also raises difficult questions about privacy and safety, and it throws a harsh spotlight on the tech industry’s biggest giants, arguing that the systems they depend on to root out sexually abusive images of children are fatally flawed.

“There isn’t a kid in the world who doesn't have this app,” the girl’s mother said, “and yet an adult can be in correspondence with them, manipulating them, over the course of many years, and the company does nothing. How does that happen?”

In the lawsuit, filed in a California federal court this week, the girl – requesting anonymity as a victim of sexual abuse and referred to only as LW – and her mother accuse Snapchat of negligently failing to design a platform that could protect its users from “egregious harm”.

The man – an active-duty Marine who was convicted in a military court last year of charges related to child pornography and sexual abuse – saved her Snapchat photos and videos and shared them with others around the web, a criminal investigation found.

Snapchat’s parent company, Snap, has defended its app’s core features of self-deleting messages and instant video chats as helping young people speak openly about important parts of their lives.

The company said it employed “the latest technologies” and developed its own software “to help us find and remove content that exploits or abuses minors”.

“While we cannot comment on active litigation, this is tragic, and we are glad the perpetrator has been caught and convicted,” Snap spokeswoman Rachel Racusen said. “Nothing is more important to us than the safety of our community.”

Founded in 2011, the California company told investors last month that it had 100 million daily active users in North America, more than double Twitter’s following in the US, and that it was used by 90% of US residents aged 13 to 24 – a group it designated the “Snapchat Generation”.

For every user in North America, the company said, it received about $31 in advertising revenue last year. Now worth nearly $50 billion (about R800bn), the public company has expanded its offerings to include augmented-reality camera glasses and auto-flying selfie drones.

But the lawsuit likens Snapchat to a defective product, saying it has focused more on innovations to capture children’s attention than on effective tools to keep them safe.

The app relies on “an inherently reactive approach that waits until a child is harmed and places the burden on the child to voluntarily report their own abuse”, the girl’s lawyers wrote. “These tools and policies are more effective in making these companies wealthier than (in) protecting the children and teens who use them.”

Apple and Google are also listed as defendants in the case because of their role in hosting an app, Chitter, that the man had used to distribute the girl’s images. Both companies said they had removed the app from their stores on Wednesday.

Apple spokesman Fred Sainz said the app had repeatedly broken Apple’s rules around “proper moderation of all user-generated content”. Google spokesman José Castañeda said the company was “deeply committed to fighting online child sexual exploitation” and has invested in techniques to find and remove abusive content. Chitter’s developers did not respond to requests for comment.

The lawsuit seeks at least $5 million in damages and assurances that Snap will invest more in protection. But it could send ripple effects through not just Silicon Valley but Washington, by calling out how the failures of federal legislators to pass tech regulation have left the industry to police itself.

“We cannot expect the same companies that benefit from children being harmed to go and protect them,” said Juyoun Han, the girl’s attorney. “That’s what the law is for.”

Brian Levine, a professor at the University of Massachusetts at Amherst who studies children’s online safety and digital forensics and is not involved in the litigation, said the legal challenge adds to the evidence that the country’s lack of tech regulation has left young people at risk.

“How is it that all of the carmakers and all of the other industries have regulations for child safety, and one of the most important industries in America has next to nothing?” Levine said.

“Exploitation results in lifelong victimisation for these kids,” and it’s being fostered on online platforms developed by “what are essentially the biggest toymakers in the world, Apple and Google”, he added. “They’re making money off these apps and operating like absentee landlords … After some point, don’t they bear some responsibility?”

While most social networks focus on a central feed, Snapchat revolves around a user’s inbox of private “snaps” – the photos and videos they exchange with friends, each of which self-destructs after being viewed.

The simple concept of vanishing messages has been celebrated as a kind of anti-Facebook, creating a low-stakes refuge where anyone can express themselves as freely as they want without worrying how others might react.

Snapchat, in its early years, was often derided as a “sexting app”, and for some users the label still fits. But its popularity has also solidified it as a more broadly accepted part of digital adolescence – a place for joking, flirting, organising and working through the joys and awkwardness of teenage life.

In the first three months of this year, Snapchat was the seventh-most-downloaded app in the world, installed twice as often as Amazon, Netflix, Twitter or YouTube, estimates from the analytics company Sensor Tower show. Jennifer Stout, Snap’s vice-president of global public policy, told a US Senate panel last year that Snapchat was an “antidote” to mainstream social media and its “endless feed of unvetted content”.

Snapchat photos, videos and messages are designed to automatically vanish once the recipient sees them or after 24 hours. But Snapchat’s carefree culture has raised fears that it’s made it too easy for young people to share images they may one day regret.

Snapchat allows recipients to save some photos or videos within the app, and it notifies the sender if a recipient tries to capture a photo or video marked for self-deletion. But third-party workarounds are rampant, allowing recipients to capture them undetected.

Parent groups also worry the app is drawing in adults looking to prey on a younger audience. Snap has said it accounts for “the unique sensitivities and considerations of minors” when developing the app, which now bans users younger than 18 from posting publicly in places such as Snap Maps and limits how often children and teens are served up as “Quick Add” friend suggestions in other users' accounts.

The app encourages people to talk with friends they know from real life and only allows someone to communicate with a recipient who has marked them as a friend.

The company said it takes fears of child exploitation seriously. In the second half of 2021, it deleted roughly 5 million pieces of content and nearly 2 million accounts for breaking its rules around sexually explicit content, according to a transparency report released last month. About 200 000 of those accounts were axed after sharing photos or videos of child sexual abuse.

But Snap representatives have argued they’re limited in their abilities when a user meets someone elsewhere and brings that connection to Snapchat. They’ve also cautioned against more aggressively scanning personal messages, saying it could devastate users’ sense of privacy and trust.

Some of its safeguards, however, are fairly minimal. Snap says users must be 13 or older, but the app, like many other platforms, doesn't use an age-verification system, so any child who knows how to type a fake birthday can create an account. Snap said it works to identify and delete the accounts of users younger than 13 – and the Children’s Online Privacy Protection Act, or Coppa, bans companies from tracking or targeting users under that age.

Snap says its servers delete most photos, videos and messages once both sides have viewed them, and all unopened snaps after 30 days. Snap said it preserves some account information, including reported content, and shares it with law enforcement when legally requested. But it also tells police that much of its content is “permanently deleted and unavailable”, limiting what it can turn over as part of a search warrant or investigation.

In 2014, the company agreed to settle charges from the Federal Trade Commission alleging that Snapchat had deceived users about the “disappearing nature” of their photos and videos, and collected geolocation and contact data from their phones without their knowledge or consent.

Snapchat, the commission said, had also failed to implement basic safeguards, such as verifying people’s phone numbers. Some users had ended up sending “personal snaps to complete strangers” who had registered with phone numbers that weren't actually theirs.

A Snapchat representative said at the time that “while we were focused on building, some things didn't get the attention they could have”. The commission required the company to submit to monitoring from an “independent privacy professional” until 2034.

Like many major tech companies, Snapchat uses automated systems to patrol for sexually exploitative content: PhotoDNA, built in 2009, to scan still images, and CSAI Match, developed by YouTube engineers in 2014, to analyse videos.

The systems work by looking for matches against a database of previously reported sexual-abuse material run by the government-funded National Center for Missing and Exploited Children (NCMEC).

But neither system is built to identify abuse in newly captured photos or videos, even though those have become the primary ways Snapchat and other messaging apps are used today.
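In essence, these defences work like a lookup against a fixed list of fingerprints of already-reported images, which is why material that has never been reported before slips through. The sketch below is a minimal illustration of that idea under stated assumptions, not Snap’s, Microsoft’s or YouTube’s actual implementation: the function names are hypothetical, and a plain cryptographic hash stands in for the robust perceptual hashes that systems such as PhotoDNA use to tolerate resizing and re-compression.

```python
# Minimal sketch of blacklist-style hash matching (hypothetical names).
# Real systems such as PhotoDNA use perceptual hashes, not SHA-256,
# and match against a vetted database of previously reported imagery.
import hashlib

# Stand-in for the database of fingerprints of previously reported material.
# In practice this would be populated from a vetted source such as NCMEC's hash lists.
known_abuse_hashes: set = set()

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded image (cryptographic hash as a stand-in)."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_abuse_image(image_bytes: bytes) -> bool:
    """Flag an image only if this exact content was previously reported and hashed.

    A newly captured photo or video has no entry in the database, so it is never caught.
    """
    return fingerprint(image_bytes) in known_abuse_hashes
```

That limitation is what the researchers described in the following paragraphs were pointing at when they called for tools that can flag previously unseen imagery rather than only re-circulated copies.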

When the girl began sending and receiving explicit content in 2018, Snap didn’t scan videos at all. The company started using CSAI Match only in 2020.

In 2019, a team of researchers at Google, the NCMEC and the anti-abuse non-profit organisation Thorn had argued that even systems such as those had reached a breaking point. The “exponential growth and the frequency of unique images”, they argued, required a “reimagining” of child-sexual-abuse-imagery defences away from the blacklist-based systems tech companies had relied on for years.

They urged the companies to use recent advances in facial-detection, image-classification and age-prediction software to automatically flag scenes where a child appears at risk of abuse and alert human investigators for further review.

“In the absence of new protections, society will be unable to adequately protect victims of child sexual abuse,” the researchers wrote.

Three years later, such systems remain unused. Some similar efforts have also been halted because of criticism they could improperly pry into people’s private conversations or raise the risks of a false match.

In September, Apple indefinitely postponed two proposed systems – to detect possible sexual-abuse images stored online and to block potentially harmful messages from being sent to children – following a firestorm of criticism that the technology could be misused for surveillance or censorship.

Privacy advocates have cautioned that more rigorous online policing could end up penalising children for being children. They’ve also worried that such concerns could further fuel a moral panic, in which some conservative activists have called for the axing of LGBTQ teachers who discuss gender or sexual orientation with their pupils, falsely equating it to child abuse.

But the case adds to a growing wave of lawsuits challenging tech companies to take more responsibility for their users’ safety – and arguing that past precedents should no longer apply.

The companies have traditionally argued in court that one law, Section 230 of the Communications Decency Act, should shield them from legal liability related to the content their users post. But lawyers have increasingly argued that the protection should not insulate companies from punishment for design choices that promoted harmful use.

In one case filed in 2019, the parents of two boys killed when their car smashed into a tree at 180km/h while recording a Snapchat video sued the company, saying its “negligent design” decision to allow users to imprint real-time speedometers on their videos had encouraged reckless driving.

A California judge dismissed the suit, citing Section 230, but a federal appeals court revived the case last year, saying it centred on the “predictable consequences of designing Snapchat in such a way that it allegedly encouraged dangerous behaviour”. Snap has since removed the “Speed Filter”. The case is continuing.

In a separate lawsuit, the mother of an 11-year-old Connecticut girl sued Snap and Instagram parent company Meta this year, alleging that the girl had been routinely pressured by men on the apps to send sexually explicit photos of herself – some of which were later shared around her school. The girl killed herself last year, the mother said, due in part to her depression and shame from the episode.

Congress has voiced some interest in passing more-robust regulation, with a bipartisan group of senators writing a letter to Snap and dozens of other tech companies in 2019 asking what proactive steps they had taken to detect and stop online abuse.

But the few proposed tech bills have faced immense criticism, with no guarantee of becoming law. The most notable – the Earn It Act, which was introduced in 2020 and passed a Senate committee vote in February – would open tech companies to more lawsuits over child-sexual-abuse imagery, but technology and civil rights advocates have criticised it as potentially weakening online privacy for everyone.

Some tech experts note that predators can contact children on any communications medium and that there is no simple way to make every app completely safe.

Snap’s defenders say applying some traditional safeguards – such as the nudity filters used to screen out pornography around the web – to personal messages between consenting friends would raise its own privacy concerns.

But some still question why Snap and other tech companies have struggled to design new tools for detecting abuse.

Hany Farid, an image-forensics expert at the University of California at Berkeley, who helped develop PhotoDNA, said safety and privacy have for years taken a “back seat to engagement and profits”.

The fact that PhotoDNA, now more than a decade old, remains the industry standard “tells you something about the investment in these technologies”, he said. “The companies are so lethargic in terms of enforcement and thinking about these risks … at the same time, they’re marketing their products to younger and younger kids.”

Farid, who has worked as a paid adviser to Snap on online safety, said that he believes the company could do more but that the problem of child exploitation is industry-wide.

“We don’t treat the harms from technology the same way we treat the harms of romaine lettuce,” he said. “One person dies, and we pull every single head of romaine lettuce out of every store,” yet the child-exploitation problem is decades old. “Why do we not have spectacular technologies to protect kids online?”

The girl said the man messaged her randomly one day on Instagram in 2018, just before her 13th birthday. He fawned over her, she said, at a time when she was feeling self-conscious. Then he asked for her Snapchat account.

“Every girl has insecurities,” said the girl, who lives in California. “With me, he fed on those insecurities to boost me up, which built a connection between us. Then he used that connection to pull strings.”

He started asking for photos of her in her underwear, then pressured her to send videos of herself nude, then more explicit videos to match the ones he sent of himself. When she refused, he berated her until she complied, the lawsuit states. He always demanded more.

She blocked him several times, but he messaged her through Instagram or via fake Snapchat accounts until she started talking to him again, the lawyers wrote. Hundreds of photos and videos were exchanged over a three-year span.

The girl said she felt ashamed, but was afraid to tell her parents. She also worried what he might do if she stopped. She thought reporting him through Snapchat would do nothing, or that it could lead to her name getting out, the photos following her for the rest of her life.

“I thought this would be a secret,” she said. “That I would just keep this to myself forever.” (Snap officials said users can anonymously report concerning messages or behaviours, and that its “trust and safety” teams respond to most reports within two hours.)

One day she saw some boys at school laughing at nude photos of young girls and realised it could have been her. She built up her confidence over the next week. Then she sat with her mother in her bedroom and told her what had happened.

Her mother said that she had tried to follow the girl’s public social media accounts and saw no red flags. She had known her daughter used Snapchat, like all of her friends, but the app is designed to give no indication of who someone is talking to or what they’ve sent. In the app, when she looked at her daughter's profile, all she could see was her cartoon avatar.

The lawyers cite Snapchat’s privacy policy to show that the app collects troves of data about its users, including their location and who they communicate with – enough, they argue, that Snap should be able to prevent more users from being “exposed to unsafe and unprotected situations”.

Stout, the Snap executive, told the Senate Commerce, Science and Transportation Committee’s consumer protection panel in October that the company was building tools to “give parents more oversight without sacrificing privacy”, including letting them see their children’s friends list and who they’re talking to. A company spokesman said those features were slated for release in a few months.

Thinking back to those years, the mother said she’s devastated. The Snapchat app, she believes, should have known everything, including that her daughter was a young girl. Why did it not flag that her account was sending and receiving so many explicit photos and videos? Why was no one alerted that an older man was constantly messaging her using overtly sexual phrases, telling her things like “lick it up”?

After the family called the police, the man was charged with sexual abuse of a child involving indecent exposure as well as the production, distribution and possession of child pornography.

At the time, the man had been a US Marine Corps lance corporal stationed at a military base, according to court-martial records.

As part of the Marine Corps' criminal investigation, the man was found to have coerced other underage girls into sending sexually explicit videos that he then traded with other accounts on Chitter. The lawsuit cites a number of Apple App Store reviews from users saying the app was rife with “creeps” and “paedophiles” sharing sexual photos of children.

The man told investigators he used Snapchat because he knew the “chats will go away”. In October, he was dishonourably discharged and sentenced to seven years in prison, the court-martial records show.

The girl said she has suffered from guilt, anxiety and depression after years of quietly enduring the exploitation and had attempted suicide. The pain “is killing me faster than life is killing me”, she said in the lawsuit.

Her mother said that the past year had been devastating, and that she worried about teens like her daughter – the funny girl with the messy room, who loves to dance, who wants to study psychology so she can understand how people think.

“The criminal gets punished, but the platform doesn’t. It doesn't make sense,” the mother said. “They’re making billions of dollars on the backs of their victims, and the burden is all on us.”