A 27-year-old woman, a victim of child sexual abuse, is suing Apple for more than $1.2 billion, claiming the company failed to deploy its own system for detecting and removing child sexual abuse material from its iCloud service. The lawsuit alleges that by abandoning its NeuralHash detection system, Apple broke its promise to protect victims of child sexual abuse and allowed illegal material to proliferate on its platform.
Key Points
Apple faces a $1.2 billion lawsuit for failing to detect child sexual abuse material on iCloud
Lawsuit alleges Apple broke promise to protect victims of child sexual abuse
Apple introduced, then abandoned, its NeuralHash system for detecting illegal material
Plaintiff seeks compensation for victims of child sexual abuse
Pros
Seeking justice for victims of child sexual abuse
Potential for significant implications for tech industry accountability
Cons
Potential privacy concerns raised by scanning systems like NeuralHash