“The trend is not slowing down,” Sheehan said.
The issue will take center stage at a high-profile hearing Wednesday, where the CEOs of Meta, X, TikTok, Snap and Discord are set to testify before the Senate Judiciary Committee about their respective efforts to combat child sexual abuse material, known as CSAM.
But assigning blame for the problem may prove easier than solving it. Efforts to curb CSAM are complicated by the diffuse nature of the internet, legal questions around free speech and tech company liability, and the fact that more than 90 percent of reported CSAM is uploaded by people outside the United States.
Senators are convening the hearing to rally support for a slate of bills aimed at expanding protections for children online, including one that would let victims of child sexual abuse sue platforms that facilitate exploitation. But the proposals face pushback from tech lobbyists and some digital rights groups, who say they would undermine privacy protections and push platforms to inadvertently remove legitimate posts. Other measures focus on giving prosecutors more tools to go after those who spread CSAM.
Preventing the sexual exploitation of children is one of the rare issues with the potential to unite Republicans and Democrats. But for years, technology has outpaced attempts at regulation. From nude photos of teenagers circulated without their consent to graphic videos of young children being sexually assaulted, the boom in CSAM has been fueled by the ever-wider availability of smartphones, surveillance devices, private messaging tools and unmoderated online forums.
“CSAM was once created and exchanged in secret online rings, but that has changed over the years,” said Carrie Goldberg, a lawyer specializing in sex crimes. “Most kids now have the tools in their hands, including their cell phones, to produce it themselves.”
Increasingly, online predators take advantage by posing as friendly peers on social networks and messaging apps, luring teens into sending risqué photos and videos of themselves. They then use those images as leverage to demand more graphic material or money, a form of blackmail known as “sextortion.”
The human toll can be severe: some victims have been abducted, some forced into sexual slavery, and some have died by suicide. Many others are traumatized or live in fear of their images and videos being exposed to friends, parents and the wider world, Goldberg said. NCMEC said last year that sextortion schemes, which often target adolescent boys, had been linked to at least a dozen suicides.
Reports of online enticement, a category that includes sextortion, swelled from 80,000 in 2022 to 186,000 in 2023, said Sheehan of NCMEC, which serves as a clearinghouse for reports of online CSAM from around the world. A growing number of perpetrators are based in West African countries such as Ivory Coast and Nigeria, he noted, the latter of which has long been a hotbed of online fraud.
While enticement is on the rise, the majority of CSAM is still produced by abusers who “have legal access to children,” including “parents or guardians, relatives, babysitters, neighbors,” Sheehan said. And although more than 90 percent of the CSAM reported to NCMEC is uploaded in countries outside the United States, most of it is discovered and reported on U.S.-based online platforms, including Meta’s Facebook and Instagram, Google, Snapchat, Discord and TikTok.
“Globally, we don’t have enough investigators to do this job,” Sheehan said, a shortage that is especially acute overseas and limits the ability to track down and prosecute perpetrators. At the same time, “many would argue we can’t simply arrest our way out of these problems. It’s also on the technology companies, which are better positioned to detect, remove and deter bad actors on these platforms.”
These companies have come under increasing pressure in recent years to address the problem, whether by proactively monitoring for CSAM or by redesigning features that lend themselves to it. In November, Omegle, a U.S.-based platform that had become notorious as a hub for pedophiles, shut down amid a barrage of lawsuits, including one brought by Goldberg’s firm. The app’s motto — “Talk to strangers!” — didn’t help its case.
Mary Anne Franks, a George Washington University Law School professor who leads the Cyber Civil Rights Initiative, said Wednesday’s Senate hearing will test whether lawmakers can turn bipartisan agreement that CSAM is a problem into meaningful legislation.
“No one really defends the First Amendment rights of sex offenders,” she said. The challenge is enacting laws that force tech companies to police their platforms more aggressively without chilling broader legal expression online.
In the 1990s, as Americans began logging on to the Web via dial-up modems, Congress moved to criminalize the transmission of online pornography to children with the Communications Decency Act. But a year later, the Supreme Court struck down much of the law, ruling that its overly broad prohibitions would wipe out legally protected speech. Ironically, the law’s most lasting legacy is what became known as Section 230, which gives broad protection to websites and online platforms from civil liability for content posted by their users.
A 2008 law tasked the Justice Department with leading efforts against CSAM and required internet platforms to report known cases to NCMEC. But a 2022 report by the Government Accountability Office found that many of the law’s requirements were not being consistently fulfilled. And while the law requires U.S.-based internet platforms to report CSAM when they find it, it does not require them to look for it in the first place.
As a result, the companies that do the most to monitor for CSAM can come off looking the worst, because their reports surface more instances of it on their platforms than those of companies that don’t look, NCMEC’s Sheehan said.
“There are some companies, like Meta, that are very careful to ensure no part of their network is a place where this type of activity can occur,” he said. “But there are other large companies that report far fewer cases, and that’s because they choose not to look for them.”
Meta reported the most CSAM files on its platforms in 2022, the most recent year for which company-specific data is available, with more than 21 million reports on Facebook alone. Google reported 2.2 million, Snapchat 550,000, TikTok 290,000 and Discord 170,000. Twitter, which has since been renamed X, reported just under 100,000.
Apple, which has more than 2 billion devices in active use worldwide, reported just 234 CSAM incidents. Neither Google nor Apple was called to testify at Wednesday’s hearing.
“Companies like Apple choose not to proactively scan for this type of content,” Sheehan said. “They are essentially creating a safe haven where very few reports ever make it to the CyberTipline.”
In 2022, Apple halted an effort to begin scanning users’ iCloud Photos accounts for CSAM after a backlash from privacy advocates. Asked for comment, the company pointed to an August 2023 statement in which it said CSAM is “abhorrent” but that scanning iCloud “has serious unintended consequences for users,” such as creating a “slippery slope” toward other kinds of invasive monitoring.
Even when CSAM is reported, NCMEC has no authority to investigate or prosecute the perpetrators. Instead, it acts as a clearinghouse, forwarding reports to the relevant law enforcement agencies. How those reports are followed up varies widely by jurisdiction, Sheehan said.
Momentum is building in Congress to strengthen protections for children online, but none of the major proposals has yet become law. The Senate Judiciary Committee advanced several of them with unanimous support, but they have since stalled in the Senate, with no clear timeline for proponents to bring them to a floor vote.
Sen. Dick Durbin (D-Ill.), who chairs the committee holding the hearing, said in an interview that Senate Majority Leader Charles E. Schumer (D-N.Y.) has not committed to bringing the bills to a floor vote. Even if Schumer did, the measures would need to gain significant support in the House, where several key proposals have yet to be introduced.
Looming over any attempt to chip away at tech platforms’ liability shield is SESTA-FOSTA, a 2018 law that rolled back Section 230 protections for content that facilitates sex trafficking. Critics argue that the law pushed companies to crack down on many other legal forms of sexual content, ultimately harming sex workers more than it helped them.
Durbin said the hearing is ultimately about holding companies accountable for how their platforms can put children in harm’s way.
“As far as I’m concerned, there are no heroes in this conversation,” he said in an interview, referring to the companies testifying. “They are all making conscious, profit-driven decisions not to protect children or put safety first in the process.”
Goldberg said certain features of online apps are especially attractive to child predators. Predators flock to apps that attract large numbers of children, offer ways for strangers to contact them, and allow camera access and private communication between users, she said.
Goldberg argued that many companies know their apps’ designs facilitate child abuse but “refuse to fix it,” pointing to the laws that limit their liability. “The only way to pressure companies to fix their products is to make them pay for the damage,” she said.
Franks agreed that lawmakers berating tech CEOs won’t accomplish much unless it is backed by legislation that changes the incentives those companies face.
“You want to embarrass these companies, right? You want to highlight all these horrible things that have come to light,” she said. “But you’re not really changing the fundamental structure.”