No child porn scan on Apple iPhones for the time being

In early August, Apple announced a radical move against child pornography. The company wanted to search for such material directly on its customers’ iPhones. Now Apple is backpedaling.

The announced technology for detecting potential child abuse material on the iPhones of millions of users drew worldwide criticism. Even Edward Snowden spoke out, warning that the technology, once established, could be used to hunt for all sorts of other unwelcome content. In an internal Slack channel, Apple employees are also said to have criticized the planned blanket scan of the photo libraries of all iPhones in the US.

Apple wants to take its time and revise the features

In connection with these plans, court documents from the Epic Games lawsuit against Apple also revealed that Apple has been scanning emails sent via its iCloud Mail service for abusive material for years.

Now Apple wants to “take additional time” to refine the features before presenting them to the public. Speaking to 9to5Mac, the manufacturer said:

“Last month we announced plans for features to help protect children from malicious actors who use communication tools to solicit and exploit them, and limit the distribution of child sexual abuse material. Based on feedback from customers, stakeholders, researchers, and others, we’ve decided to take more time in the coming months to gather ideas and make improvements before releasing these very important child protection features.”

Anyone hoping that Apple might abandon the plans should read the statement more closely. The company continues to describe the planned features as “very important”. Apparently this is only a technical and communicative revision intended to avoid another backlash. In addition, researchers have already recreated and tricked the core component of the photo scan, so Apple may also want to revisit the technical reliability.

There is no new schedule

Apple’s new child safety features were supposed to roll out as part of updates to iOS 15, iPadOS 15, and macOS Monterey later this year. For the time being, that is no longer to be expected. Apple has not communicated a new schedule, and there is also little information about what exactly the company wants to change before the features are introduced again.

The other child safety features announced by Apple last month and now also postponed include one that would send parents a warning message if their child receives or sends nude photos in Apple’s iMessage chat service.

This is how Apple’s photo scanning technology is supposed to work

Apple’s method of detecting material that depicts child abuse, so-called CSAM (Child Sexual Abuse Material), is designed to protect user privacy. Instead of scanning images in the cloud, the system performs the matching on the device, using a database of known CSAM image hashes provided by the National Center for Missing & Exploited Children (NCMEC) and other child protection organizations. Apple converts this database into an unreadable set of hashes that is stored securely on users’ devices.
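
To make the on-device matching step more concrete, here is a heavily simplified sketch in Swift. It is not Apple’s implementation: the `imageHash` function uses SHA-256 from CryptoKit as a stand-in for the perceptual NeuralHash, and the known-hash database is a plain, readable set rather than the blinded form Apple describes; `matchesKnownHash` and the placeholder entries are hypothetical names used only for illustration.

```swift
import Foundation
import CryptoKit

// Stand-in for the hash a real system would derive with a perceptual
// hashing function such as NeuralHash; SHA-256 only matches bit-identical
// files, which is enough to illustrate the flow.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical local database of known hashes. In Apple's design this set
// is delivered to the device in a blinded, unreadable form.
let knownHashes: Set<String> = [
    "placeholder-hash-1",
    "placeholder-hash-2",
]

// Returns true if the photo matches a known hash and should therefore be
// flagged before it is uploaded to the cloud photo library.
func matchesKnownHash(_ imageData: Data) -> Bool {
    knownHashes.contains(imageHash(imageData))
}

// Usage: check a photo before upload.
let photo = Data("example image bytes".utf8)
print(matchesKnownHash(photo)) // false with this placeholder database
```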

Before an image is saved to iCloud Photos, it is compared on the device against the known CSAM hashes. A cryptographic technique called Private Set Intersection is used to determine whether there is a match without revealing the result. If there is a match, the device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image and uploads it to iCloud Photos together with the image. These vouchers can then be opened in the cloud and subjected to a further check.
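
The following sketch illustrates only the voucher idea under a simplifying assumption: it uses ordinary AES-GCM encryption with a single shared key instead of Private Set Intersection and Apple’s threshold scheme, so it shows merely that the match result travels with the image in encrypted form and is only read during the later server-side check. The `SafetyVoucher` type and the `makeVoucher` and `reviewVoucher` functions are hypothetical names for illustration.

```swift
import Foundation
import CryptoKit

// Hypothetical safety voucher: an encrypted payload that travels to the
// cloud together with the photo it belongs to.
struct SafetyVoucher {
    let encryptedPayload: Data
}

// Placeholder key. The real protocol derives its keys cryptographically so
// that vouchers only become readable under specific conditions.
let voucherKey = SymmetricKey(size: .bits256)

// Created on the device: the match result and some image metadata are
// sealed so they reveal nothing during upload.
func makeVoucher(imageID: String, matched: Bool) throws -> SafetyVoucher {
    let payload = Data("\(imageID):\(matched)".utf8)
    let sealed = try AES.GCM.seal(payload, using: voucherKey)
    // combined is non-nil for the standard 12-byte nonce used by default.
    return SafetyVoucher(encryptedPayload: sealed.combined!)
}

// Opened in the cloud for the further check mentioned above.
func reviewVoucher(_ voucher: SafetyVoucher) throws -> String {
    let box = try AES.GCM.SealedBox(combined: voucher.encryptedPayload)
    let payload = try AES.GCM.open(box, using: voucherKey)
    return String(decoding: payload, as: UTF8.self)
}

// Usage: a matched image gets a voucher uploaded alongside it.
let voucher = try makeVoucher(imageID: "IMG_0042", matched: true)
print(try reviewVoucher(voucher)) // "IMG_0042:true"
```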
