
6 Questions Answered About Scanning iPhone Photos for Child Abuse

A feature announced by Apple that scans iPhone photos for child abuse images has caused quite a stir. Will all photos be scanned later? And is this the beginning of more control over users’ photos? Apple answers in an FAQ.


FAQ about scanning iPhone photos

This week, Apple announced that photos stored in iCloud will be scanned for child abuse images in the United States. More countries will follow later. Apple does the scanning with special software called NeuralHash, which compares the photos against a database of known child abuse images.

By the way, the software does not compare the pictures themselves, but their hashes. A hash is a kind of digital fingerprint that every photo has. If the hashes (a type of code) match, Apple is notified and the company alerts the authorities.
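To make that idea concrete, here is a minimal sketch in Swift. It uses an ordinary cryptographic hash (SHA-256) as the fingerprint, whereas Apple’s actual system relies on its proprietary perceptual hash, NeuralHash; the database value below is a placeholder. The point is only to show that fingerprints, not photos, are compared.

```swift
import Foundation
import CryptoKit

// Illustration only: Apple's real system uses its proprietary perceptual
// hash (NeuralHash), not SHA-256. This shows the general principle of
// comparing digital fingerprints instead of the photos themselves.
func fingerprint(of photoData: Data) -> String {
    let digest = SHA256.hash(data: photoData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Hypothetical database of known fingerprints (placeholder value).
let knownHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

let photo = Data("example photo bytes".utf8)
if knownHashes.contains(fingerprint(of: photo)) {
    print("Fingerprint matches a known image")
} else {
    print("No match")
}
```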

Despite the good intentions, critics spoke out. Authoritarian regimes could force Apple to use the software for other purposes, for example to track down people with politically dissenting opinions. And iPhone users wonder whether their photos are still private.

To allay these concerns, Apple has published an FAQ. We summarize the most interesting questions and answers below.

1. Does Apple scan all photos stored on my iPhone?

No. The feature only applies to photos that the user uploads to iCloud Photos. And even then, Apple is only informed about accounts that have known CSAM (Child Sexual Abuse Material, ed.) in their library. The system does not work for users who have disabled iCloud Photos.

2. Will CSAM images be downloaded to my iPhone to compare with my photos?

No. CSAM images are not saved on or sent to the device. Instead of real images, Apple uses unreadable hashes stored on the device. These hashes are strings of numbers that represent known CSAM images, but it is not possible to read those hashes or convert them back into the images.

The sets of hashes are based on images obtained and validated as CSAM by child safety organizations. Using new cryptographic techniques, Apple can use these hashes to learn only about iCloud Photos accounts that contain known CSAM material. Apple only detects photos that match known CSAM, without learning anything about other photos.
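A conceptual sketch of that claim, again in Swift: the real system uses cryptographic protocols (such as private set intersection) so that Apple learns nothing about non-matching photos, but the end result can be modeled with plain sets and made-up placeholder hash strings.

```swift
// Conceptual sketch only, with made-up hash strings. The actual protocol
// is cryptographic; here plain sets stand in for the outcome: only
// matches surface, non-matching hashes stay unknown.
let knownCSAMHashes: Set<String> = ["hashA", "hashB"]           // supplied by child safety organizations
let userPhotoHashes: Set<String> = ["hashX", "hashY", "hashA"]  // computed on the device

let matches = userPhotoHashes.intersection(knownCSAMHashes)
print(matches.isEmpty ? "Nothing to report" : "Known-CSAM matches: \(matches.count)")
```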

3. Can CSAM detection be used to detect things other than CSAM?

Our process is designed to prevent that. CSAM detection for iCloud Photos is built to work only with CSAM hashes provided by NCMEC (the US National Center for Missing & Exploited Children) and other child safety organizations.

There is no automated reporting to authorities; Apple conducts a human review before reporting to NCMEC. In most countries, including the United States, simply possessing CSAM is a crime and Apple is required to report all cases to the appropriate authorities.

4. Can governments force Apple to track down other types of photos?

Apple will decline such requests. We’ve had to deal with government demands that compromise user privacy before, and we’ve steadfastly rejected them.

We will continue to refuse them in the future. Let’s be clear: this technique is limited to detecting CSAM in iCloud Photos, and we won’t respond to requests to extend it.

5. Will CSAM Detection in iCloud Photos report innocent people to the police?

No. The technique is designed to be very accurate, and the chances of the system incorrectly flagging an account are less than one in one trillion per year.

In addition, Apple conducts a human review before reporting to the NCMEC. As a result, system errors or attacks will not result in innocent people being reported to the NCMEC.

6. Will the CSAM detection also come to the Netherlands and Belgium?

The CSAM detection will initially only work in the United States, but the tech giant wants to expand this to more countries later.

It remains to be seen whether the scanning of iCloud photos for child abuse will also come to these regions. This has everything to do with the AVG, the Dutch name for the GDPR, the European privacy law. ‘In Europe we have the AVG, and as it stands now, it puts a stop to this,’ privacy law specialist Peter Krager told BNR.

Apple would not be allowed to use the software here, because the tech giant first checks the photos itself before the authorities are called in, and that is not permitted. ‘A private organization that helps with enforcement, such as Apple, could do so, but then there would have to be legislation for it,’ Krager said.

Want to stay up to date with the latest Apple news? Download the iPhoned app or sign up for the newsletter.
