West Virginia Attorney General JB McCuskey has filed a lawsuit against Apple (AAPL), alleging the company has failed to stop the storage of child sexual abuse material, or CSAM, on its iCloud platform.
“Preserving the privacy of child predators is absolutely inexcusable. And more importantly, it violates West Virginia law,” McCuskey said. “Since Apple has so far refused to police themselves and do the morally right thing, I am filing this lawsuit to demand Apple follow the law, report these images, and stop re-victimizing children by allowing these images to be stored and shared.”
Apple shares were down 0.4% in midday trading on Thursday.
The suit was filed on Thursday morning in the Circuit Court of Mason County, West Virginia. It points out that under federal law, all U.S.-based technology companies are required to report detected CSAM to the National Center for Missing and Exploited Children (NCMEC).
It asserts that Apple has failed to deploy adequate CSAM detection technology. For example, in 2023, Google (GOOG)(GOOGL) filed 1.47M reports with the NCMEC, while Meta Platforms (META) filed a staggering 30.6M. Apple, by contrast, filed 267.
“As a direct and proximate result of its conduct, Apple is liable under multiple, independent theories of law, including strict liability for design defect, negligence for failing to implement adequate CSAM reporting technologies, creating or contributing to a public nuisance by facilitating the storage and hosting of unlawful CSAM content, and violations of the West Virginia Consumer Credit and Protection Act,” reads the lawsuit.
The suit also points out that in 2021, Apple announced a proprietary set of CSAM detection tools that would automatically detect such material and report it to the NCMEC.
“However, by December 2022, in an effort to protect its brand and outsized smartphone and digital storage market share, Apple shelved the update and abandoned the project,” the lawsuit said.
Apple contends it maintains the safest platform for children.
“At Apple, protecting the safety and privacy of our users, especially children, is central to what we do,” Apple said in a statement to Seeking Alpha. “We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids. All of our industry-leading parental controls and features, like Communication Safety—which automatically intervenes on kids’ devices when nudity is detected in Messages, shared Photos, AirDrop and even live FaceTime calls—are designed with the safety, security, and privacy of our users at their core.”
Still, the lawsuit contends Apple’s devotion to user privacy has led to a proliferation of unreported CSAM on its platforms.
“While Apple may believe its decisions further its narrative around privacy as a core company value, the reality is that Apple’s decisions create market signals directed to users who value secrecy because they are engaged in high-risk activities,” the suit said. “Apple knows that the intrinsic design of its products draws bad actors.”
The suit calls on Apple “to take reasonable measures to abate the public nuisance its conduct has caused, created, and/or contributed to” as well as make “restitution, disgorgement, civil penalties, and all other pecuniary and monetary relief available under law.”