Apple is reaching into people’s phones. Would you put up with this on Android?
Apple has taken a very radical (and very controversial) approach to users’ privacy in its fight against the spread of child pornography. Using a new tool named after the material it targets, Child Sexual Abuse Material (CSAM), it will check photos and videos directly on iPhones. This goes beyond the boundary established so far, where companies used their algorithms and bots to check only content uploaded to remote cloud storage. Now the checks will take place directly on the phones. For now only in the USA, and after negotiations with politicians and authorities, possibly in other countries as well.
The move has, of course, provoked loud criticism, but Apple is trying to reassure opponents with several arguments. The system, which works by comparing photos against known abusive images provided by the National Center for Missing and Exploited Children, is said to have a very low error rate: reportedly, only in one case in a billion could someone be falsely accused of storing or distributing child pornography. Once the algorithm flags a suspicious image, the content will be reviewed by a human. So there should be no danger of being punished for, say, a photo of your own naked toddler in the garden.
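For illustration only, here is a minimal sketch of the idea described above: matching on-device photo fingerprints against a list of known images and flagging only after a threshold is crossed. The hash function, the hash list, and the threshold here are assumptions; Apple’s actual system uses a perceptual “NeuralHash” and a blinded database, not SHA-256.

```swift
import Foundation
import CryptoKit

// Hypothetical stub: in reality the database of known fingerprints ships with
// the OS in an encrypted/blinded form.
func loadKnownHashes() -> Set<String> {
    return []
}

let knownHashes = loadKnownHashes()
let matchThreshold = 30   // hypothetical: flag only after many matches accumulate

// Stand-in fingerprint. A real perceptual hash tolerates resizing and
// recompression; SHA-256 does not, and is used here only to keep the sketch
// self-contained.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Returns true only when enough photos match known fingerprints, mirroring the
// "threshold first, human review afterwards" flow described in the article.
func exceedsReviewThreshold(_ photos: [Data]) -> Bool {
    let matches = photos.filter { knownHashes.contains(fingerprint(of: $0)) }.count
    return matches >= matchThreshold
}
```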
Of course, many users also object to the very fact that someone will be looking into their phones. In response, Apple says that photos and videos will remain encrypted, and inspecting them will be possible only after the CSAM system flags them. Employees of the American giant will therefore not have free access to the content. Alongside this, Apple is also introducing a sensitive content filter for messaging on teenagers’ phones. Inappropriate photos or videos will be blurred by the phone as a protective measure, and parents will be notified about the message.
Would you accept something like this on Android?
Source: 9to5