
Apple will soon scan iPhone photos for child abuse

Starting with iOS 15, Apple will use special software to scan photos for known images of child abuse. When suspicious matches are detected, the software raises a flag and the images are checked by a human review team.


iPhone Photos: Scan for Child Abuse

The Cupertino company announced the plans, which have already caused quite a stir, in a press release on Thursday evening. At launch, only photos from users in the United States will be checked. Support for more countries is expected to follow later.

Apple uses special software that scans an iPhone user’s photos before they are uploaded to iCloud. This software is called NeuralHash. The program compares hashes of the user’s photos against a database from the National Center for Missing & Exploited Children (NCMEC).

NCMEC is the organization that acts as the child-abuse reporting center in the US. If an iPhone user’s photos match known images of child abuse, Apple will inform NCMEC. Apple may also decide to review more of the user’s images.
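The matching flow described above can be sketched roughly as follows. NeuralHash itself is a proprietary perceptual hash, so this illustration substitutes SHA-256 purely to show the structure; the database contents, threshold value, and function names are all illustrative assumptions, not Apple’s actual implementation.

```python
import hashlib

# Stand-in for NeuralHash: a real perceptual hash tolerates small image
# edits, while SHA-256 only matches byte-identical inputs. Used here
# solely to illustrate the matching flow.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Stand-in for the NCMEC-derived database of known-image hashes.
known_csam_hashes = {
    image_hash(b"known-bad-image-1"),
    image_hash(b"known-bad-image-2"),
}

# Assumption for illustration: a minimum number of matches must
# accumulate before anything is escalated for human review.
REPORT_THRESHOLD = 2

def scan_library(photos: list[bytes]) -> bool:
    """Return True if enough photos match the database to trigger review."""
    matches = sum(image_hash(p) in known_csam_hashes for p in photos)
    return matches >= REPORT_THRESHOLD

user_photos = [b"vacation", b"known-bad-image-1", b"known-bad-image-2"]
print(scan_library(user_photos))  # -> True
```

The key design point is that only hashes are compared: the database on the device never contains the images themselves, and a photo that matches nothing produces no information beyond a failed lookup.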

Is privacy at stake?

The introduction of the software is striking, because Apple attaches great importance to user privacy. ‘What happens on your iPhone, stays on your iPhone’ is a well-known slogan. Still, Apple emphasizes that the software is designed with privacy in mind.

Scanning is done locally on the device. Apple converts the NCMEC database into unreadable code and stores it on the user’s iPhone or iPad. The NeuralHash software then uses a cryptographic technique called Private Set Intersection to determine whether there is a match, without revealing non-matching photos to Apple.


Despite the good intentions, privacy experts are concerned. “Even if you’re confident that Apple isn’t abusing these tools, there’s still a lot to worry about,” US security researcher Matthew Green tweeted. “Imagine if the software ends up in the hands of authoritarian regimes.”

Green fears that Apple will inspire other tech giants to adopt similar techniques. “The floodgates are now open,” he writes. “Governments will demand it of everyone.” Incidentally, Apple is not the first technology company to scan photos for child abuse. Google, Twitter and Facebook, among others, use similar hashing methods to find and report such images.

