Apple Will Scan iPhones in the United States for Images of Child Sexual Abuse

Child protection groups applauded Apple’s plan to scan iPhones for images of child sexual abuse, but several security researchers warned the system could be exploited by governments seeking to surveil their citizens. The technology, named “NeuralHash,” will scan photos before they are uploaded to iCloud. If it finds a match against a database of known abuse imagery, a human reviewer will evaluate the image. If child sexual abuse material is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) will be notified.
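Apple has not published NeuralHash’s internals, but the reported flow (hash each photo on the device, compare against a database of known fingerprints, escalate matches to human review) can be sketched roughly as follows. The names here are hypothetical stand-ins, and the placeholder hash is cryptographic rather than perceptual:

```python
# Rough sketch of the reported on-device matching flow. NeuralHash is
# proprietary; perceptual_hash() is a hypothetical stand-in (a real
# perceptual hash maps visually similar images to the same digest,
# unlike the cryptographic placeholder used here).
import hashlib
from typing import Iterable

def perceptual_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()  # placeholder only

def scan_before_upload(photos: Iterable[bytes], known_hashes: set) -> list:
    """Return indices of photos matching the known-image database.
    In the reported design, matches go to a human reviewer before any
    account action or report to NCMEC."""
    return [i for i, p in enumerate(photos) if perceptual_hash(p) in known_hashes]
```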

Apple also reportedly plans to analyze encrypted messages for sexually explicit content as a child-safety measure, which has alarmed privacy advocates. Only images already in the center’s database of known child sexual abuse material will be flagged, so parents photographing their child in the bath should have nothing to fear. Experts, however, worry the matching technology could be put to more sinister uses.

How It Can Be Misused

The technology could be used to frame innocent people by sending them seemingly harmless photos engineered to trigger matches for child sexual abuse imagery, fooling Apple’s algorithm and alerting law enforcement. “Researchers have been able to achieve this rather easily,” said Matthew Green, a cryptography researcher at Johns Hopkins University. Surveillance of dissidents or protesters is another possible misuse.
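NeuralHash itself is a proprietary neural-network hash, but the weakness Green describes can be made concrete with a toy “average hash”: because perceptual hashes deliberately ignore fine detail so that near-duplicates still match, an attacker who knows the algorithm can construct an innocuous-looking image that hashes to any targeted value. The sketch below is purely illustrative and is not Apple’s algorithm:

```python
# Toy demonstration of a perceptual-hash collision. This is a classic
# "average hash" (aHash), not Apple's NeuralHash.
import numpy as np
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> np.ndarray:
    """Threshold a small grayscale thumbnail at its mean brightness.
    The BOX filter averages each region of the source image."""
    small = np.asarray(img.convert("L").resize((size, size), Image.BOX),
                       dtype=np.float32)
    return (small > small.mean()).flatten()

def forge_image(target_bits: np.ndarray, size: int = 8) -> Image.Image:
    """Build an innocuous image of gray squares whose aHash equals
    target_bits: bright blocks for 1-bits, dark blocks for 0-bits."""
    blocks = np.where(target_bits.reshape(size, size), 200, 50).astype(np.uint8)
    return Image.fromarray(blocks, "L").resize((256, 256), Image.NEAREST)

# Pretend this is the fingerprint of a flagged image in the database:
target = np.random.default_rng(0).integers(0, 2, 64).astype(bool)
fake = forge_image(target)  # visually just a grid of gray squares
assert np.array_equal(average_hash(fake), target)  # triggers a "match"
```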

Exchange of Digital Fingerprints

Microsoft, Google, Facebook, and others have for years exchanged digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan user files stored in iCloud, which is not as securely encrypted as on-device data, for child sexual abuse material. Governments have long pressed Apple to allow more monitoring of encrypted data, so in designing the new safety measures, Apple had to strike a difficult balance between combating child abuse and preserving its high-profile commitment to user privacy.

But a disappointed Electronic Frontier Foundation called Apple’s compromise on privacy “a surprising about-face for consumers who have depended on the company’s leadership in privacy and security.” The computer scientist who invented PhotoDNA, the technology used by law enforcement to identify child sexual abuse imagery online, acknowledged the potential for misuse but said the need to combat child sexual abuse outweighed it.

When Will the Upgrade Be Available?

Apple was one of the first major companies to adopt “end-to-end” encryption, in which only the sender and the recipient can read a message. Law enforcement has long demanded access to that data to investigate crimes such as terrorism and child sexual exploitation. The update will reach iPhone, Mac, and Apple Watch users later this year.
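iMessage’s actual protocol is far more elaborate, but the core property (only the sender and recipient can read a message) can be sketched with the PyNaCl library. This is a generic public-key box, assumed here only for illustration, not Apple’s implementation:

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
from nacl.public import PrivateKey, Box

# Each party generates a keypair; only the public halves are shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# A relay server sees only ciphertext; it holds no key that decrypts it.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```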

Changing the Rules

According to the NCMEC, “Apple’s expanded protection for children is a game changer.” Given the widespread use of Apple products, the new safety measures could save children’s lives. “The need for privacy is balanced with digital safety for children,” said Julie Cordua, CEO of Thorn, the non-profit founded by Demi Moore and Ashton Kutcher that uses technology to identify victims and works with tech platforms.

Is It Flawed?

The Washington-based Center for Democracy and Technology (CDT) has challenged that promise of “end-to-end encryption,” arguing that scanning a phone or computer for sexually explicit content breaks its security. The group also questioned whether Apple’s technology can reliably distinguish harmful material from benign content such as art and memes; according to the CDT, such technologies are notoriously error-prone. Apple denies that the changes amount to a backdoor into its encryption, calling them carefully considered innovations that preserve user privacy.

How Will Parents Be Alerted?

In addition to blurring sexually explicit images on children’s phones, Apple’s Messages app can notify the parents of younger children when such photos appear. Apple’s software will also “intervene” when users try to search for material related to child sexual abuse. To receive alerts about sexually explicit images, parents must enroll their child’s phone; teenagers can opt out, in which case their parents are not notified. Apple says neither feature compromises privacy or alerts law enforcement.
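Taken together, the rules Apple describes amount to a small decision policy. The sketch below encodes them with invented field names, purely to illustrate the opt-in logic; it is not Apple’s implementation:

```python
# Hypothetical model of the parental-alert rules described above.
from dataclasses import dataclass

@dataclass
class ChildDevice:
    enrolled_by_parent: bool  # parents must enroll the phone for alerts
    teen_opted_out: bool      # teenagers can opt out of parent alerts

def handle_explicit_image(device: ChildDevice) -> dict:
    """Blur on-device; notify parents only when policy allows.
    Per Apple, nothing here reports to law enforcement."""
    return {
        "blur_image": True,
        "notify_parents": device.enrolled_by_parent and not device.teen_opted_out,
        "notify_police": False,
    }
```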