Apple has been sued over its decision not to implement a system that would scan iCloud photos for child sexual abuse material, The New York Times reports.
The lawsuit argues that by not doing more to prevent the spread of child sexual abuse photos and videos, Apple forces victims to relive their trauma.
It also alleges that Apple announced "widely publicized improved designs aimed at protecting children" and then failed to "implement those designs or do anything to detect and limit" such material.
In 2021, Apple announced an iCloud scanning system that would use digital signatures supplied by the National Center for Missing and Exploited Children to detect known child sexual abuse material in photos and videos. Security and privacy experts, however, warned that such a system could create a "backdoor" for government surveillance, and Apple ultimately abandoned the plan.
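In broad strokes, systems of this kind compare a fingerprint of each photo against a database of fingerprints of known abuse imagery. The sketch below is a deliberately simplified illustration of that matching step, not Apple's design: the announced proposal used a perceptual hash (NeuralHash) and cryptographic matching protocols, whereas this example simply checks a SHA-256 digest against a hypothetical local set, and every name in it is illustrative.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for a database of fingerprints of known abuse imagery.
// In the announced design this data would have come from NCMEC in an encrypted,
// opaque form; here it is just a set of hex-encoded digests for illustration.
func loadKnownDigests() -> Set<String> {
    return []  // placeholder: no real data
}

let knownDigests = loadKnownDigests()

// Returns true if the photo's digest matches a known entry. A real system
// would use a perceptual hash that tolerates resizing and re-encoding;
// a cryptographic hash like SHA-256 only matches byte-identical files.
func matchesKnownImage(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownDigests.contains(hex)
}
```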
The lawsuit was filed under a pseudonym by a 27-year-old woman. She says a relative molested her when she was an infant and shared images of the abuse online. As a result, she still receives notices from law enforcement almost every day informing her that someone has been charged with possessing those images.
Attorney James Marsh, who is involved in the lawsuit, says there is a potential pool of 2,680 victims who could be entitled to compensation in the case.