Apple announced plans to scan US iPhones for images of child sexual abuse, eliciting praise from child welfare organisations but raising concerns among security researchers that the system could be misused, notably by governments seeking to monitor their citizens. The technology, called “neuralMatch,” is designed to recognise known images of child sexual abuse and will scan photos before they are uploaded to iCloud.
If the system finds a match, the photograph will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified. Separately, as a child safety measure, Apple plans to analyse users’ encrypted messages for sexually explicit content, a move that has alarmed privacy advocates.
The same technology could also be abused by governments to surveil dissidents or protesters. Microsoft, Google, Facebook, and other technology companies have for years shared digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan for child pornography in user files stored in its iCloud service, which is not as securely protected as on-device data.
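To illustrate the general idea behind fingerprint matching, the sketch below shows how an image’s perceptual hash might be compared against a database of fingerprints of known images. This is a simplified, hypothetical example only: the fingerprints, threshold, and matching logic are placeholders, not Apple’s neuralMatch algorithm or Microsoft’s PhotoDNA.

```python
# Hypothetical sketch of fingerprint matching; not Apple's or Microsoft's actual system.

def hamming_distance(fp_a: str, fp_b: str) -> int:
    """Count differing bits between two equal-length hex fingerprints."""
    return bin(int(fp_a, 16) ^ int(fp_b, 16)).count("1")

# Placeholder database of fingerprints of known images (in practice, such
# hashes are distributed to participating companies by a clearinghouse).
KNOWN_FINGERPRINTS = {
    "a3f1c29e77b04d5a",
    "0f4e9b12cd83a671",
}

MATCH_THRESHOLD = 4  # assumed: max differing bits still treated as a match

def matches_known_image(candidate_fp: str) -> bool:
    """Return True if the candidate fingerprint is close to any known one.

    Perceptual hashes, unlike cryptographic hashes, map visually similar
    images to nearby fingerprints, so matching is a nearest-neighbour check
    rather than a test for exact equality.
    """
    return any(
        hamming_distance(candidate_fp, known) <= MATCH_THRESHOLD
        for known in KNOWN_FINGERPRINTS
    )

if __name__ == "__main__":
    # A fingerprint one bit away from a known entry still counts as a match.
    print(matches_known_image("a3f1c29e77b04d5b"))  # True
    print(matches_known_image("ffffffffffffffff"))  # False
```

In a deployed system, a flagged match would trigger the human review step described above rather than any automatic action.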