Apple has announced details of a new system to detect child sexual abuse material (CSAM) on the phones of its users in the United States. Before an image is stored in a user's cloud photo library, the technology will search for matches against known CSAM images.
As early as January 2020, Apple's senior privacy officer Jane Horvath confirmed that Apple scans photos uploaded by users to the cloud to look for images of child sexual abuse.
Apple said that once child abuse imagery is found, it will be passed to a human reviewer for confirmation, and the user will then be reported to law enforcement.
Some are already concerned that the technology could infringe on the right to privacy. Some experts even worry that it could be used by authoritarian governments to monitor their citizens, and that it could be extended to scan phones for other prohibited content or even political speech.
Apple said that upcoming versions of iOS and iPadOS will include "new encryption software to help limit the online spread of CSAM, designed with user privacy in mind."
How it works
BBC North America technology reporter James Clayton said the system works by comparing pictures on users' phones against a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.
These images are converted into "hashes", numerical codes that can be matched against images on an Apple device.
Apple said the technology will also catch edited but visually similar versions of original images.
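Apple has not published its matching code, and its production system uses a proprietary perceptual hash. As a rough illustration of the general idea only, the Python sketch below computes a simple "average hash" (a well-known perceptual hashing technique, not Apple's) and compares hashes by Hamming distance, which is what lets small edits still produce a match. It requires the Pillow library; the hash list, function names, and threshold are all assumptions for illustration.

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, hash_size: int = 8) -> int:
    """Reduce the image to a tiny grayscale grid and derive a 64-bit
    fingerprint: each bit records whether a pixel is brighter than the
    mean. Small edits (resizing, mild recompression) flip few bits."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

# Placeholder values, not real data: a device-side check might flag an
# image whose hash falls within a small distance of any known entry.
KNOWN_HASHES = {0x8F3A_5C21_9B04_E7D6}

def matches_known(path: str, threshold: int = 5) -> bool:
    """Illustrative threshold: a real system would tune this carefully
    to trade off false positives against tolerance for edits."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

Unlike a cryptographic hash, where changing one pixel changes the whole digest, a perceptual hash is designed so that similar images yield similar codes, which is why an edited copy can still be detected.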
“High accuracy”
Apple said the technology automatically scans an image and compares it against known CSAM hashes before the image is stored in the cloud. The company says the system is extremely accurate, with the chance of incorrectly flagging any given account at less than one in one trillion per year.
Apple said that if a match is found, a human reviewer will manually assess each report and confirm it. Steps will then be taken to disable the user's account and report the matter to the authorities. According to Apple, the new technique offers "significant" privacy advantages over existing approaches.
Apple said that an account is flagged only when multiple known CSAM images are found in its cloud photo library, a threshold it says helps protect children while guarding against false matches.
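Apple's actual threshold mechanism is reported to rely on cryptographic techniques rather than a plain counter, and the real threshold value is not public. The sketch below only illustrates the basic idea of escalating an account to human review after multiple matches; the threshold value and all names are assumptions.

```python
from collections import defaultdict

MATCH_THRESHOLD = 3  # illustrative; Apple has not published its threshold

# Matched-image count per account. A real system would track this with
# cryptographic "safety vouchers" so no one learns anything about an
# account until the threshold is crossed.
match_counts: defaultdict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Record one matched image; return True once the account crosses
    the threshold and should be escalated to human review."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD
```

The point of the threshold is that a single false match on its own reveals nothing and triggers nothing; only an accumulation of matches leads to review.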
Other technology companies, including Microsoft, Google and Facebook, are also reported to share hash lists of known child sexual abuse images.
Privacy concerns
Despite this, some privacy experts have voiced concern.
Matthew Green, a security researcher at Johns Hopkins University in the US, said that whatever Apple's long-term plans are, the company has sent a very clear signal that it believes it is safe to build systems that monitor and scan users' phones for prohibited content.
He added that whether Apple turns out to be right or wrong on that point hardly matters: it will open the floodgates, and governments will demand the same of everyone.