
Apple postpones CSAM scanning indefinitely

In the past few weeks, controversy around Apple has flared up. The trigger was the planned detection of child sexual abuse material (CSAM) on users' own devices. The system was criticized mainly because, unlike the competition, Apple does not scan photos in iCloud but instead checks all images in the local library (provided iCloud Photos is enabled).

The problem was not Apple's genuinely good intention to protect children and take problematic material out of circulation, but the considerable potential for abuse of the new scanning function: in principle, Apple could match against any set of images and report the results to the authorities. According to the company, it would never come to that, but skepticism remains appropriate.

Tim Cook has to backtrack on CSAM

Now the Californian company seems to be giving in a little. Apple announced that it will take additional time before introducing the feature in order to respond to the public criticism. This means the system will not ship with iOS 15, iPadOS 15, and macOS Monterey as planned, but will probably arrive in a later update.

Apple did not specify which changes it intends to make or which feedback was decisive in postponing the feature. It will therefore be interesting to see how the company tries to win the public over to its mechanisms, especially since the new safety features for iMessage have been postponed as well.

The changes planned for iMessage will also be postponed (Image: Alexander Shatov)

These features are meant to make communication on the encrypted service safer: adult content is filtered out and, among other things, parents are notified. It remains to be seen what the revised plans will look like in the end, or whether Apple will perhaps backtrack completely after all.

My opinion:

It was foreseeable that Apple would change its mind about the CSAM implementation. Too much criticism has rained down on the company in recent weeks, and at the same time the manufacturer was risking its good reputation on data protection. I think Apple will ultimately backtrack completely and only roll out the CSAM checks in iCloud.

Via 9to5Mac
