One of the internet’s oldest, ugliest problems keeps getting worse.
Despite decades of efforts to crack down on sexual pictures and videos of children online, they’re more widely available now than ever, according to new data from the nonprofit tasked by the U.S. government with tracking such material. John Shehan, head of the exploited children division at the National Center for Missing and Exploited Children, says reports of child sexual abuse material on online platforms grew from 32 million in 2022 to a record high of more than 36 million in 2023.
“The trends aren’t slowing down,” Shehan said.
On Wednesday, a high-profile hearing will spotlight the issue as the CEOs of tech companies Meta, X, TikTok, Snap and Discord testify before the Senate Judiciary Committee on their respective efforts to combat child sexual abuse material, known as CSAM.
But decrying the problem may prove easier than solving it. The diffuse nature of the internet, legal questions around free speech and tech company liability, and the fact that 90 percent of reported CSAM is uploaded by people outside the United States all complicate efforts to rein it in.
Senators are convening the hearing as they look to build support for a suite of bills intended to expand protections for children online, including a measure that would allow victims of child sexual abuse to sue platforms that facilitate exploitation. But the proposals have faced pushback from tech lobbyists and some digital rights groups, who argue they would undermine privacy protections and force platforms to inadvertently take down lawful posts. Other measures focus on giving prosecutors more tools to go after those who spread CSAM.
Preventing the sexual exploitation of children is one of the rare issues with the potential to unite Republicans and Democrats. Yet over the years, technology has outpaced attempts at regulation. From naked pictures of teens circulated without their consent to graphic videos of young children being sexually assaulted, the boom has been fueled by the ever-wider global availability of smartphones, surveillance devices, private messaging tools and unmoderated online forums.
“CSAM has changed over the years, where it once was produced and exchanged in secretive online rings,” said Carrie Goldberg, a lawyer who specializes in sex crimes. “Now most kids have tools in the palm of their hands — i.e., their own phones — to produce it themselves.”
Increasingly, online predators take advantage of that by posing as a flirty peer on a social network or messaging app to entice teens to send compromising photos or videos of themselves. Then they use those images as leverage to demand more graphic videos or money, a form of blackmail known as "sextortion."
The human costs can be grave: some victims have been abducted, forced into sex slavery or driven to suicide. Many others, Goldberg said, are emotionally scarred or live in fear of their images or videos being exposed to friends, parents and the wider world. Sextortion schemes in particular, often targeting adolescent boys, have been linked to at least a dozen suicides, NCMEC said last year.
Reports of online enticement, including sextortion, ballooned from 80,000 in 2022 to 186,000 in 2023, said Shehan of NCMEC, which serves as a clearinghouse for reports of online CSAM from around the world. A growing number of those schemes are perpetrated by predators in West African countries, he noted, including Ivory Coast and Nigeria, the latter of which has long been a hotbed for online scams.
Even as enticement is on the rise, the majority of CSAM is still produced by abusers who have “legitimate access to children,” Shehan said, including “parents and guardians, relatives, babysitters and neighbors.” While more than 90 percent of CSAM reported to NCMEC is uploaded in countries outside the United States, the vast majority of it is found on, and reported by, U.S.-based online platforms, including Meta’s Facebook and Instagram, Google, Snapchat, Discord and TikTok.
“Globally, there aren’t enough investigators to do this work,” Shehan said, limiting the ability to track down and prosecute the perpetrators, especially overseas. At the same time, “many would argue we can’t just arrest our way out of these issues. It’s also on the tech companies that can better detect, remove and prevent bad actors from being on these platforms.”
Those companies have faced increasing pressure in recent years to address the problem, whether by proactively monitoring for CSAM or altering the design of products that are especially conducive to it. In November, one U.S.-based platform called Omegle that had become infamous as a hub for pedophiles shut down amid a string of lawsuits, including some filed by Goldberg’s firm. The app’s motto — “Talk to strangers!” — didn’t help its case.
Wednesday’s Senate hearing will test whether lawmakers can turn bipartisan agreement that CSAM is a problem into meaningful legislation, said Mary Anne Franks, professor at George Washington University Law School and president of the Cyber Civil Rights Initiative.
“No one is really out there advocating for the First Amendment rights of sexual predators,” she said. The difficulty lies in crafting laws that would compel tech companies to more proactively police their platforms without chilling a much wider range of legal online expression.
In the 1990s, as Americans began to log on to the web via dial-up modems, Congress moved to criminalize the transmission of online pornography to children with the Communications Decency Act. But the Supreme Court struck down much of the law a year later, ruling that its overly broad prohibitions would sweep up legally protected speech. Ironically, the act’s most enduring legacy was what has become known as Section 230, which gave websites and online platforms broad protections from civil liability for content their users post.
A 2008 law tasked the Justice Department with tackling CSAM and required internet platforms to report any known instances to NCMEC. But a 2022 report by the Government Accountability Office found that many of the law’s requirements had not been consistently fulfilled. And while the law requires U.S.-based internet platforms to report CSAM when they find it, it doesn’t require them to look for it in the first place.
The result, NCMEC's Shehan said, is that the companies that monitor most aggressively for CSAM come out looking the worst, because their reports show more of it on their platforms than those of companies that don't look.
“There are some companies like Meta who go above and beyond to make sure that there are no portions of their network where this type of activity occurs,” he said. “But then there are some other massive companies that have much smaller numbers, and it’s because they choose not to look.”
Meta reported by far the largest number of CSAM files on its platforms in 2022, the most recent year for which company-specific data is available, with more than 21 million reports on Facebook alone. Google reported 2.2 million, Snapchat 550,000, TikTok 290,000 and Discord 170,000. Twitter, which has since been renamed X, reported just under 100,000.
Apple, which has more than 2 billion devices in active use around the world, reported just 234 incidents of CSAM. Neither Google nor Apple was called to testify in this week’s hearing.
“Companies like Apple have chosen not to proactively scan for this type of content,” Shehan said. “They’ve essentially created a safe haven that keeps them to a very, very small number of reports into the CyberTipline on a regular basis.”
In 2022, Apple scrapped an effort to begin scanning for CSAM in users’ iCloud Photos accounts after a backlash from privacy advocates. Asked for comment, the company referred to an August 2023 statement in which it said CSAM is “abhorrent” but that scanning iCloud would “pose serious unintended consequences for our users.” For instance, Apple said, it could create a “slippery slope” to other kinds of invasive surveillance.
Even when CSAM is reported, NCMEC doesn’t have the authority to investigate or prosecute the perpetrators. Instead, it serves as a clearinghouse, forwarding reports to the relevant law enforcement agencies. How they follow up can vary widely among jurisdictions, Shehan said.
In Congress, momentum to strengthen online child safety protections has been building, but it has yet to translate to major new laws. While the Senate Judiciary Committee has advanced some proposals with unanimous support, they have since languished in the Senate with no clear timetable for proponents to bring them to the floor.
Sen. Dick Durbin (D-Ill.), who chairs the panel holding the hearing, said in an interview that Senate Majority Leader Charles E. Schumer (D-N.Y.) has not yet committed to bringing the bills to a floor vote. Even if Schumer did, the package would still need to gain significant traction in the House, where several key measures have yet to be introduced.
Looming over any attempt to chip away at tech platforms' liability shield is a 2018 law called SESTA-FOSTA, which rolled back Section 230 protections for facilitating content involving sex trafficking. Critics say the law led companies to crack down on many other legal forms of sexual content, ultimately harming sex workers as much as, or more than, it helped them.
Durbin said that the hearing is ultimately about holding the companies accountable for the way their platforms can expose children to harm.
"There are no heroes in this conversation as far as I'm concerned," he said of the companies testifying. "They're all making conscious, profit-driven decisions that do not protect children or put safety into the process."
Goldberg said specific types of features in online apps are especially attractive to child predators. In particular, she said, predators flock to apps that attract lots of children, give adult strangers a way to contact them, and allow camera access and private communication between users.
She argued that many companies know their apps’ designs facilitate child abuse but “refuse to fix it” because of laws that limit their liability. “The only way to pressure corporations to repair their products is to make them pay for their harms,” she said.
Browbeating tech CEOs won't help, Franks agreed, unless politicians back it up with laws that change the incentives their companies face.
“You want to embarrass these companies. You want to highlight all these terrible things that have come to light,” she said. “But you’re not really changing the underlying structure.”