Apple is introducing a two-pronged system that scans photos on its devices for content that could be classified as Child Sexual Abuse Material (CSAM). While child protection agencies applaud the move, digital privacy advocates and industry peers are raising concerns that the technology could have far-reaching implications for user privacy.
As part of the mechanism, Apple’s neuralMatch tool will scan photos before they are uploaded to iCloud and examine the content of messages sent through its end-to-end encrypted iMessage app. According to Apple, ‘the Messages app will use on-device machine learning to warn about sensitive content while keeping private communications unreadable by Apple.’
Using neuralMatch, images will be compared against a database of known child abuse imagery. If a match is flagged, Apple staff will manually review the images, and if child abuse is confirmed, the National Center for Missing and Exploited Children (NCMEC) in the US will be notified. At a briefing on Friday, a day after its initial announcement of the project, the Cupertino-based tech giant said it will roll out the system for checking photos for child abuse imagery ‘on a country-by-country basis, depending on local laws’.
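Apple has not published neuralMatch’s internals, so the short Python sketch below is purely illustrative of the database-matching flow the article describes: hash each photo, compare it against a set of known-image hashes, and flag matches for human review. Every name and value in it is hypothetical, and it uses a cryptographic hash for simplicity, whereas a real system would use a perceptual hash so that resized or re-encoded copies of a known image still match.

import hashlib

# Hypothetical database of hashes of known abusive images (placeholder value).
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest for an image. Illustrative only: Apple's actual
    system is reported to use a perceptual, neural-network-based hash,
    not a cryptographic hash like SHA-256."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(image_bytes: bytes) -> bool:
    """Flag an image if its hash matches the known-image database.
    Flagged images would then go to manual review, as the article notes."""
    return image_hash(image_bytes) in KNOWN_IMAGE_HASHES

if __name__ == "__main__":
    sample = b"example image bytes"
    print("flagged:", flag_for_review(sample))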
Critics, however, have interpreted the move as creating a backdoor into encrypted messages and services. The Electronic Frontier Foundation, a California-based non-profit, wrote in a blog post: ‘Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor’.
Apple’s move has reignited the debate over governments and law enforcement agencies seeking backdoors into encrypted services, and experts are watching for signs that Apple has fundamentally changed its stance on user privacy rights.