Apple releases document to address and clarify doubts about its system against child abuse

oriXone

The new measures to protect children from abuse that Apple presented last week caused quite a stir and controversy. At Applesfera, we explained in detail how these scanning measures work in iCloud and Messages, but Apple has now sought to clear up many doubts with a new document: a question-and-answer FAQ on expanded protections for children on Apple platforms.

Measures to keep children safe and prevent the dissemination of abusive material

At Apple, our goal is to create technology that empowers people and enriches their lives, while helping them stay safe. We want to protect children from predators who use communication tools to recruit and exploit them, as well as limit the spread of Child Sexual Abuse Material (CSAM).

This is how Apple's document begins, with a short introduction making it clear that communication safety in iMessage and CSAM detection in iCloud Photos are two different features that use different technology. The company explains that in the first, images sent and received in iMessage are analyzed on the device itself, and only on devices set up for children in a family account.

The second is independent of the first and focuses on detecting photos uploaded to iCloud that match known CSAM material; it does not apply to users who have not enabled iCloud Photos. The document is quite detailed and complete, so it is worth reading in full. Below, we summarize it as best we can, although it is always advisable to consult the original.

Communication safety in iMessage

  • It applies only to Apple accounts set up as families in iCloud.
  • Parents must turn it on for their family group, and only they can receive alerts about messages, and only for children aged 12 or under.
  • iMessage does not share any information with Apple, with NCMEC (the National Center for Missing & Exploited Children in the United States), or with the police.
  • Message encryption, privacy, and security are not broken, because Apple never has access to the communications. The user stays in control of their conversations and Apple does not intervene at all, not even in the notifications about sexually explicit content sent to children 12 or under.
  • For children aged 13 to 17, parents will not receive notifications; this feature only covers children 12 or younger. Those younger children still decide for themselves whether they want to view or send a flagged image, and if they go ahead, their parents are notified. A minimal sketch of this notification logic follows the list.
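
To make those rules easier to follow, here is a minimal, purely illustrative sketch of the decision logic described above. It is not Apple's implementation; the types, names, and structure are assumptions made only for this example.

```swift
// Purely illustrative sketch of the rules described in Apple's FAQ.
// These types and names are hypothetical, not Apple's APIs.

struct ChildAccount {
    let age: Int
    let isInFamilyGroup: Bool        // the feature only exists for family accounts
    let parentsEnabledSafety: Bool   // parents must opt in for their family group
}

/// A parent notification is generated only when: the account is a child account
/// in a family group, the feature has been enabled, the child is 12 or under,
/// and the child chose to view or send the flagged image anyway.
/// Children aged 13 to 17 still see the on-device warning, but no notification is sent.
func shouldNotifyParents(account: ChildAccount, childChoseToProceed: Bool) -> Bool {
    account.isInFamilyGroup
        && account.parentsEnabledSafety
        && account.age <= 12
        && childChoseToProceed
}
```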

Detecting CSAM material in iCloud Photos

  • iCloud photo scanning applies only to photos that the user uploads to iCloud. Even then, Apple only receives alerts about images that match already known CSAM content.
  • It does not apply to users who have iCloud Photos turned off, and it does not scan photos stored only in the device's local photo library.
  • In no case are CSAM images downloaded to the device to carry out the analysis; hashes of known CSAM photos are enough. A hash is a string of numbers that represents one of those images, cannot be read or reconstructed back into a picture, and is derived from verified CSAM content.
  • The analysis takes place on the device itself, where these hashes are compared against the photos being uploaded to iCloud (see the sketch after this list). At no point does Apple know or see which photos are uploaded to iCloud.
  • Other companies scan every photo stored in their cloud. Apple's method protects privacy because it focuses only on photos that match a known CSAM image and are part of an iCloud library.
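
As a very rough illustration of that matching step, here is a minimal sketch in Swift. It is not Apple's implementation: the real system uses a perceptual hash (NeuralHash) together with cryptographic blinding, so that neither the device nor Apple sees individual match results in the clear; the plain SHA-256, the names, and the empty hash set below are placeholders used only to show the idea.

```swift
import Foundation
import CryptoKit

// Purely illustrative on-device hash matching. All names are hypothetical.

/// Stand-in for the database of known-CSAM hashes that ships inside the OS.
let knownCSAMHashes: Set<String> = []   // placeholder, intentionally empty

/// Stand-in hash of a photo's raw bytes. A real perceptual hash stays stable
/// across resizing and re-encoding; SHA-256 is used here only for brevity.
func photoHash(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

/// Conceptually, this check runs only for photos queued for upload to iCloud
/// Photos; photos kept purely in the local library are never checked.
func matchesKnownCSAM(_ photoData: Data) -> Bool {
    knownCSAMHashes.contains(photoHash(photoData))
}
```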

Other security issues regarding CSAM photo detection

One of the complaints or concerns about the new system announced by Apple is whether it could be used to detect things other than CSAM images. An alert is only raised when several photos test positive against the CSAM hashes, and it is then reviewed by people before anything is reported to NCMEC in the United States. Therefore, only photos that correspond to child sexual abuse material end up being reported to that organization. A minimal sketch of this threshold idea follows below.
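
Purely for illustration, this is what that thresholding amounts to. In Apple's actual design the matches are counted through encrypted safety vouchers and threshold secret sharing, so nothing is revealed until the threshold is crossed; the counter, the names, and the threshold value used here are assumptions for the example only.

```swift
// Illustrative only: names and the threshold value are placeholders.

struct AccountMatchState {
    private(set) var positiveMatches = 0
    let reviewThreshold = 30   // hypothetical value chosen for the example

    /// Called once per photo checked against the known-CSAM hash list.
    mutating func record(isMatch: Bool) {
        if isMatch { positiveMatches += 1 }
    }

    /// Only after several matches does the case go to human review;
    /// a single match never produces a report on its own.
    var needsHumanReview: Bool {
        positiveMatches >= reviewThreshold
    }
}
```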

Many have argued that governments could force Apple to add non-CSAM images to its list. The Cupertino company says it will refuse such requests if they occur, and points out that in the past it has directly opposed government demands to create backdoors in its devices, a clear reference to the San Bernardino terrorist's iPhone and its confrontation with the FBI.

Apple says the system will launch in the United States, and that it will explore the possibility of expanding it to other countries and regions

The company maintains that its system does not allow images to be injected to trigger alerts, because the hashes are stored inside the operating system of the iPhone and iPad. There is no way to aim an attack at a particular individual, and each report is reviewed manually by Apple before anything is sent. Likewise, it is not possible to wrongly accuse innocent people: the error rate for incorrectly flagging an account is on the order of 1 in 1,000,000,000,000, and if such an error ever occurs, the manual review will catch it.

A controversy that could have been avoided with good communication

Of course, the steps Apple will be implementing with iOS 15 are impressive. The company has gone to great lengths to preserve privacy, encryption, and security in communications and photos, while creating one system to recognize CSAM material and another to prevent sexual communications with minors. Even so, the controversy was quick to arise, precisely because of Apple's well-known stance in favor of the privacy and security of its users.

The high standard we hold the company to on these issues is well earned. But this episode is also a sign that the matter should have been handled differently. Letting it leak to the press and then announcing it out of the blue was not a good idea. Since then the company has been forced to comment here and there, finally releasing this document, which resolves many legitimate questions from users.

This was a communication blunder that allowed fear and misinformation to spread around a very demanding and delicate subject.

It would have been enough to select a handful of journalists and brief them on all these measures in advance, then disclose the information both officially and through those outlets, just as happens with embargoed reviews of new products.

Apple probably planned to do something along those lines closer to the launch of iOS 15 and its other operating systems, thinking it was not yet necessary. What has become clear is that, when it comes to privacy and security, the company cannot afford to trail behind events.
