West Virginia sues Apple over child sex abuse material stored and shared on iCloud


West Virginia’s attorney general filed a lawsuit on Thursday accusing Apple of allowing its iCloud service to become a vehicle for distributing child sexual abuse material (CSAM).

The state alleges that the company facilitated the spread of child sexual abuse material by declining to deploy tools that scan photos and videos and detect such material in iCloud users’ collections.

JB McCuskey, West Virginia’s Republican attorney general, accused Apple of prioritizing user privacy over child safety. His office said in a statement that the case is the first of its kind brought by a government agency over the distribution of child sexual abuse material on Apple’s data storage platform.

“These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed,” McCuskey said in the statement. “This conduct is despicable, and Apple’s inaction is inexcusable.”

Apple responded with a blanket denial of the allegations: “At Apple, protecting the safety and privacy of our users, especially children, is central to what we do. We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.” The company emphasized its controls that prevent children from uploading or receiving nude images, though West Virginia’s lawsuit focuses on abusers’ use of Apple devices and services.

In 2020, an executive in charge of fraud detection at Apple texted a colleague: “We are the greatest platform for distributing child porn.” The exchange was made public during the trial between Apple and Fortnite maker Epic Games.

A group of victims of child sexual exploitation sued Apple in 2024 for $1.2bn in damages, making similar allegations. The suit is ongoing. The same year, the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accused Apple of vastly undercounting how often child sexual abuse material appeared in its products. Over a single year, according to police data obtained by the NSPCC, child predators used Apple’s iCloud, iMessage and FaceTime to store and exchange CSAM in more cases in England and Wales alone than the company reported across all other countries combined. Apple did not comment on the finding at the time.

Apple routinely makes far fewer reports of CSAM to the National Center for Missing and Exploited Children (NCMEC) than Google or Meta, according to data published by NCMEC.

The company considered scanning images saved in private iCloud accounts but abandoned the approach over concerns about user privacy and safety, including worries that such a system could be exploited by governments seeking other kinds of material for censorship or arrests.

McCuskey’s office filed the lawsuit in Mason county circuit court. The suit seeks statutory and punitive damages and asks a judge to order Apple to implement more effective measures for detecting abusive material and to adopt safer product designs.

Alphabet’s Google, Microsoft and other platform providers check uploaded photos or emailed attachments against databases of identifiers of known child sexual abuse material provided by NCMEC and other clearinghouses.
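
In outline, that kind of check is a lookup of a file’s fingerprint against a list of fingerprints of already-identified material. The sketch below is purely illustrative and is not Apple’s, Google’s or NCMEC’s actual tooling; the file paths and hash list are hypothetical, and real systems rely on purpose-built perceptual-hashing technology such as PhotoDNA rather than plain cryptographic hashes.

```python
import hashlib
from pathlib import Path

def load_known_hashes(hash_list_path: str) -> set[str]:
    """Load a hypothetical clearinghouse hash list (one hex digest per line)."""
    lines = Path(hash_list_path).read_text().splitlines()
    return {line.strip().lower() for line in lines if line.strip()}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of an uploaded file, reading in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_material(upload_path: str, known_hashes: set[str]) -> bool:
    """Flag an upload whose digest exactly matches an entry on the known list."""
    return sha256_of_file(upload_path) in known_hashes

if __name__ == "__main__":
    known = load_known_hashes("known_hashes.txt")  # hypothetical hash list
    for upload in ["photo1.jpg", "photo2.jpg"]:    # hypothetical uploads
        if is_known_material(upload, known):
            print(f"{upload}: match, escalate to reporting pipeline")
```

An exact cryptographic hash only catches byte-identical copies; the perceptual hashes used in production survive resizing and re-encoding, which is the gap Apple’s later NeuralHash effort, described below, tried to address on the device itself.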

Until 2022, Apple took a different approach. It did not scan all files uploaded to its iCloud storage offerings, and the data was not end-to-end encrypted, meaning law enforcement officials could access it with a warrant.

Reuters reported in 2020 that Apple planned end-to-end encryption for iCloud, which would have put the data out of reach of law enforcement officials. It abandoned the plan after the FBI complained it would harm investigations.

In August 2021, Apple announced NeuralHash, a system designed to balance the detection of child sexual abuse material with user privacy by scanning images on users’ devices before they were uploaded.
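
Apple has not published NeuralHash’s full design, which also involved cryptographic blinding of the hash database and a match threshold that this sketch omits. What follows is a generic, hypothetical illustration of the on-device idea of comparing a perceptual hash of a photo against hashes of known material before upload; the hash values, names and distance threshold are invented for the example.

```python
def hamming_distance(a: int, b: int) -> int:
    """Count the differing bits between two fixed-length perceptual hashes."""
    return (a ^ b).bit_count()

def matches_known_material(image_hash: int, known_hashes: list[int],
                           max_distance: int = 4) -> bool:
    """True if the photo's perceptual hash lies within a small Hamming distance
    of any hash of known material; the threshold here is illustrative only."""
    return any(hamming_distance(image_hash, known) <= max_distance
               for known in known_hashes)

# Hypothetical 64-bit perceptual hashes. A real system derives them from image
# content (NeuralHash reportedly used a neural network) so that re-encoded or
# resized copies of an image still hash close to the original.
known = [0x9F3A52C011B7E4D8]
photo = 0x9F3A52C011B7E4DC  # differs from the known hash by one bit

if matches_known_material(photo, known):
    print("candidate match: would be flagged for further verification")
```

Matching on the device keeps the photo itself out of the server-side comparison, which was the privacy trade-off Apple described; the criticism noted below was that the same machinery could be pointed at whatever hash list a government supplied.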

The system was criticized by security researchers who worried it could yield false reports of abuse material, and it sparked a backlash from privacy advocates who claimed it could be expanded to permit government surveillance.

A month later, Apple delayed the introduction of NeuralHash before canceling it in December 2022, the state said in its statement. That same month, Apple launched an option for end-to-end encryption for iCloud data.

The state said NeuralHash was inferior to other tools and could be easily evaded. It said Apple stores and synchronizes data through iCloud without proactive abuse material detection, allowing such images to circulate.

While Apple did not go through with the effort to scan images being uploaded to iCloud, it did implement a feature called Communication Safety that blurs nudity and other sensitive content being sent to or from a child’s device.

Federal law requires US-based technology companies to report abuse material to NCMEC. Apple made 267 reports in 2023, compared with 1.47m by Google and 30.6m by Meta Platforms, the state said.

The state’s claims mirror allegations in a proposed class-action lawsuit filed against Apple in late 2024 in federal court in California by individuals depicted in such images.

Apple has moved to dismiss that lawsuit, saying the firm is shielded from liability under section 230 of the Communications Decency Act, a law that provides broad protections to internet companies from lawsuits over content generated by users.
