Apple UK Under Fire for Neglecting to Report Child Sexual Images, Watchdog Alleges

  • Apple UK was implicated in more cases of predators sharing child abuse imagery in England and Wales alone than the company reported globally in an entire year
  • Apple is accused of failing to effectively monitor its platforms for child sexual abuse material (CSAM)
  • The NSPCC (National Society for the Prevention of Cruelty to Children) accuses Apple of vastly undercounting instances of CSAM on its products
  • Apple reported 267 suspected CSAM cases worldwide in 2023, far fewer than peers such as Google and Meta
  • Concerns have been raised over Apple’s decision to scrap its planned iCloud photo-scanning tool while launching a new AI system, which critics warn risks an increase in AI-generated CSAM and harm to child safety
Summarized Article:

https://www.theguardian.com/technology/article/2024/jul/22/apple-security-child-sexual-images-accusation



Related Video
Fixing Apple Child Safety Photo Scanning
Published on: August 13, 2021
Description: With Communication Safety in Messages and Image detection for iCloud Photo Library coming to the iPhone, iPad, and Mac with ...

