Apple plans to scan US iPhones for child sexual abuse images; security researchers raise privacy alarm over potential surveillance
Forget about big tech censorship. Apple now wants to monitor all U.S. iPhones for images of child sexual abuse. In an announcement today, the company unveiled plans to scan U.S. iPhones for such material, drawing praise from child protection groups.
However, security researchers are raising concerns over the potential surveillance of personal devices, warning that the system could be misused by governments looking to surveil their citizens.
Today, our smartphones already act like tracking devices that broadcast the whereabouts of their users. As you may recall, we reported four years ago on how Android phones help Google track your every move, even after you turn off location services.
Apple shared details of the proposed system, known as “neuralMatch.” In a statement on its website, Apple said:
“Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.”
With this new system, Apple, a private company, now wants to track every phone in the United States to “help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).”
Here is how the new Apple safety features would work. The proposed system would alert a team of human reviewers if CSAM is detected on a user’s phone. The reviewers would then contact law enforcement if the material can be verified. The features will roll out as part of iOS 15, expected to be released next month.
“This feature is coming in an update later this year to accounts set up as families in iCloud for iOS 15, iPadOS 15, and macOS Monterey.*”
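In broad strokes, the client-side detection step Apple describes amounts to comparing a fingerprint of each photo against a database of hashes of known abuse images, and only escalating to human review once enough matches accumulate. The sketch below is a minimal illustration of that idea under stated assumptions, not Apple’s implementation: the database, threshold, and function names are hypothetical, and a plain SHA-256 digest stands in for Apple’s perceptual “NeuralHash,” which is designed to match visually similar rather than byte-identical images.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images, of the kind
# maintained by child-safety organizations. In Apple's system the hashes come
# from a perceptual algorithm ("NeuralHash"); SHA-256 is used here only to
# keep the illustration self-contained.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Illustrative threshold: escalate to human review only after this many
# matches. Apple has described using a match threshold; the value here is
# an assumption for the example.
MATCH_THRESHOLD = 30

def fingerprint(photo: bytes) -> str:
    """Stand-in for a perceptual image hash; returns a hex digest."""
    return hashlib.sha256(photo).hexdigest()

def should_escalate(photos: list[bytes]) -> bool:
    """Count photos whose fingerprint appears in the known-hash database
    and flag the account for human review only past the threshold."""
    matches = sum(1 for p in photos if fingerprint(p) in KNOWN_HASHES)
    return matches > MATCH_THRESHOLD
```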
Commenting on the announcement, Matthew Green, a security professor at Johns Hopkins University, said: “This will break the dam — governments will demand it from everyone.” Green is believed to be the first researcher to post a tweet about the issue.
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
— Matthew Green (@matthew_d_green) August 4, 2021
Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.
— Matthew Green (@matthew_d_green) August 5, 2021
Green sounded a further warning in another tweet:
This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government? https://t.co/nB8S6hmLE3
— Matthew Green (@matthew_d_green) August 5, 2021
Green is not the only security expert sounding the alarm. Ross Anderson, professor of security engineering at the University of Cambridge, said, “It is an absolutely appalling idea because it is going to lead to distributed bulk surveillance of…our phones and laptops.”
Security researcher Alec Muffett added: “Apple is walking back privacy to enable 1984.”