Apple hampers scrutiny of its CSAM scanning by suing Corellium

There has been a great outcry in the media in the last few days. The reason is none other than Apple, which is arguably setting a dangerous precedent for monitoring users with its new scanning methods. With iOS 15, two new functions are set to arrive on all iOS devices: on the one hand, all of a user’s photos are to be automatically checked for known child sexual abuse material (CSAM) if iCloud Photos is active; on the other hand, iMessage messages are also to be scanned for sexually explicit content.
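To illustrate the basic idea behind the photo check, here is a minimal, purely conceptual sketch in Swift of threshold-based hash matching. It is not Apple’s implementation: the real system relies on a perceptual “NeuralHash”, blinded matching, and cryptographic threshold secret sharing, and all names below (CSAMMatcher, shouldFlagAccount, the placeholder hash strings, the threshold value) are hypothetical.

```swift
import Foundation

// Conceptual sketch only. Apple's actual system computes perceptual hashes
// on-device and matches them blindly; nothing here reflects that machinery.
struct CSAMMatcher {
    // Hypothetical database of known-CSAM perceptual hashes (opaque strings here).
    let knownHashes: Set<String>
    // A report is only supposed to happen once the match count exceeds a threshold.
    let reportingThreshold: Int

    /// Returns true only if the number of matching photos exceeds the threshold.
    func shouldFlagAccount(photoHashes: [String]) -> Bool {
        let matches = photoHashes.filter { knownHashes.contains($0) }.count
        return matches > reportingThreshold
    }
}

// Usage example with placeholder values.
let matcher = CSAMMatcher(knownHashes: ["hashA", "hashB"], reportingThreshold: 30)
let flagged = matcher.shouldFlagAccount(photoHashes: ["hashC", "hashA"])
print("Account flagged: \(flagged)")  // false: only one match, below the threshold
```

The point of the threshold is that no single match is supposed to trigger anything; only accounts whose match count exceeds the limit would be surfaced for human review.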

Apple itself still does not admit any mistake, even after employees and civil-rights organizations sharply criticized the move. The company says it is happy to work with security researchers to find problems and errors in the system. However, this contradicts the legal action that Apple has been pursuing against the Corellium tool for several months, because Corellium is what enables researchers to search for such errors in the first place.

With the new mechanisms, additional content in iMessage could be blocked (Image: Alexander Shatov)

To put the operating system through its paces, experts have so far either had to jailbreak a retail iPhone or rely on virtualization software. Corellium offers the latter and makes such tests much easier. Apple wants the tool banned because it allegedly infringes Apple’s copyright, while Corellium argues that its product falls under “fair use”. Either way, a ban would be a setback for iOS security.

Although Apple does run official programs in which security researchers can use the company’s tools to find and report vulnerabilities, the rules and framework conditions are very strict and do not exactly strengthen trust in the software. Apple is thus taking a contradictory path: it wants to appear transparent while keeping everything under its own control. When it comes to uncovering vulnerabilities, the two do not go together.

Are photos still safe on iPhones? (Image: Sumudu Mohottige)

With their left hand, they make jailbreaking difficult and sue companies like Corellium to prevent them from existing. Now with their right hand, they say, ‘Oh, we built this really complicated system and it turns out that some people don’t trust that Apple has done it honestly — but it’s okay because any security researcher can go ahead and prove it to themselves.’

Own opinion:

I am a little shocked by Apple’s current approach. In the past, the company usually presented itself to the public as a champion of privacy. The CSAM system may serve a good purpose, but in the long run the disadvantages could outweigh the advantages. Independent security researchers in particular could help close loopholes, yet if their path is blocked, Apple is left to rely on itself alone. It remains to be seen whether iOS 15 will ship with the new system or whether Apple will eventually back down.

Via MIT Technology Review
