Apple has announced new measures to improve its child protection system and fight against child pornography. We explain how these new measures work and what they do not do.
Apple, in collaboration with child safety experts, has added new measures to combat the spread of child pornography. These measures are divided into three areas: Messages, Siri, and Photos. Messages will analyze the photos sent and received on minors' devices, Siri will warn users who search for illegal material, and Photos will inform the authorities if child pornography is detected on our device. How these measures work is quite complex, especially if user privacy is to be preserved, a challenge Apple believes it has successfully overcome and which we explain below.
Messages
It is not widely used in Europe, but Apple's Messages system is very popular in the United States and other countries. This is why controlling the spread of sexual material involving children through Messages is so important, and it is one of the pillars of these new measures. The new system will notify children and parents when photographs containing sexually explicit material are sent or received. This will only happen on devices used by children 12 years of age or younger, with their minor account properly configured.
If a minor (12 or younger) receives a photo that Apple has classified as “sexually explicit,” it will appear blurred, and the child will be told, in language they can understand, that the image may not be suitable for them. If they decide to view it anyway (they can if they want), they will be warned that their parents will be notified. The same applies if the minor decides to send a message containing a photograph of a sexual nature.
This process occurs inside the iPhone; Apple does not intervene at any time. The photo is scanned before it is sent or when it is received on the iPhone, and an artificial intelligence system decides whether or not its content presents a risk. The notification, if it occurs, will only be received by the parents of the minor (again, 12 or under); neither Apple nor the authorities will be aware of it.
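To make the flow more concrete, here is a minimal sketch of what this on-device logic could look like. It is purely illustrative: Apple has not published its APIs, so every name here (classifyImage, notifyParents, and so on) is a hypothetical stand-in, and the real classifier is a private machine learning model.

```swift
import Foundation

// Purely illustrative sketch of the Messages flow for a child account
// (12 or under). None of these names are Apple's real APIs.

enum ImageRisk { case safe, sexuallyExplicit }

// Stand-in for Apple's private on-device ML classifier.
func classifyImage(_ imageData: Data) -> ImageRisk { .safe }

// UI stubs so the sketch compiles on its own.
func displayPhoto(_ imageData: Data) { print("photo shown") }
func blurPhoto(_ imageData: Data) { print("photo blurred") }
func warnChild(_ message: String) { print(message) }
func childChoosesToView() -> Bool { false }
func notifyParents() { print("parents notified, on device only") }

func handleIncomingPhoto(_ imageData: Data, isChildAccount: Bool) {
    // Adult accounts, and photos the classifier considers safe, are untouched.
    guard isChildAccount, classifyImage(imageData) == .sexuallyExplicit else {
        displayPhoto(imageData)
        return
    }
    blurPhoto(imageData)   // shown blurred by default
    warnChild("This photo may not be suitable for you.")
    if childChoosesToView() {
        notifyParents()    // the child may still view it, but the parents are told
        displayPhoto(imageData)
    }
}
```

The key point the sketch illustrates is that every branch runs locally: nothing in this flow ever leaves the device for Apple or the authorities.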
Siri
Apple’s virtual assistant will also be updated to fight child pornography. If someone searches for this type of content, Siri will warn them that the content is illegal and will also provide resources that may be useful, such as ways to report it. Again, the whole procedure takes place on our own device; neither Apple nor any competent authority will be aware of our searches or of the warnings Siri gives us.
Photos
This is arguably the most significant change and the one that has sparked the most controversy, with plenty of misinformation about how it works. Apple announced that iCloud will detect child pornography images that users have stored in the cloud. If we stop at this statement, many doubts arise as to how this can be done while respecting users' privacy. But Apple has thought about it and designed a system that does it without violating our privacy.
More importantly, despite the many articles published claiming otherwise, Apple will not scan your photos for child pornography. It won't take you for a criminal because you have naked pictures of your son or daughter in the tub.
What is CSAM? It stands for Child Sexual Abuse Material. It is a catalog of known child pornography photographs, compiled by different organizations and whose content is controlled by the National Center for Missing and Exploited Children (NCMEC). Each of these photographs has an invariable digital signature, and it is precisely this signature that is used to determine whether or not a user has these photographs. The system compares the signatures of our photographs with those in the CSAM catalog; only if there are matches will the alarm sound.
Therefore, Apple is not going to browse our photos to see whether their content is sexual, it is not going to use artificial intelligence on them, and it is not even going to look at them. It will only use the digital signature of each photograph and compare it with the signatures included in the CSAM catalog, and only if there is a match will our content be reviewed. What if one of my photos is mistakenly identified as inappropriate content? Apple assures us that this is practically impossible, but even if it happened, there would be no problem. First, one match is not enough: there must be multiple matches (we don't know how many), and only if that threshold is exceeded would Apple look at those specific photographs to assess whether they are indeed child pornography before informing the authorities.
For this reason, the photographs must be stored in iCloud: the comparison of digital signatures happens partly on the device, but in the event of a positive result, a manual review of the content is performed by Apple employees on the photos in iCloud.
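To see what this signature matching amounts to, here is a minimal sketch under heavy simplification. In reality Apple uses a perceptual hash it calls NeuralHash together with cryptographic techniques (private set intersection and threshold secret sharing), and the actual threshold value is undisclosed; every name and value below is an invented placeholder, not Apple's implementation.

```swift
import Foundation

// Minimal sketch of signature matching against the known CSAM list.
// Placeholder only: the real system uses NeuralHash (a perceptual hash)
// and cryptographic threshold techniques, none of which appear here.

typealias PhotoSignature = String

// Signatures of known, cataloged CSAM photographs (placeholder values;
// the real list comes from NCMEC).
let knownCSAMSignatures: Set<PhotoSignature> = ["sig-a", "sig-b", "sig-c"]

// Apple has not published the real threshold; this value is invented.
let matchThreshold = 10

func signature(of photo: Data) -> PhotoSignature {
    // Stand-in: the real signature is a perceptual hash designed to
    // survive resizing or re-encoding, unlike this naive hash.
    String(photo.hashValue)
}

// The content of the photos is never inspected, only their signatures;
// human review at Apple happens only above the threshold.
func shouldFlagForHumanReview(iCloudPhotos: [Data]) -> Bool {
    let matches = iCloudPhotos.filter {
        knownCSAMSignatures.contains(signature(of: $0))
    }.count
    return matches > matchThreshold
}
```

What matters in the sketch is that the comparison is between opaque identifiers, never the images themselves, and that a single match does nothing on its own.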
Questions about your privacy?
Any detection system raises doubts about its effectiveness and/or its respect for privacy. It may seem that for a detection system to be truly effective, users' privacy must be violated, but the reality is that Apple has designed a system in which our privacy is guaranteed. The detection systems in Messages and Siri raise no doubts, because everything happens inside our device, without Apple knowing anything. Only the photo detection system in iCloud might raise doubts, but the reality is that Apple has taken great care to ensure that our data remains ours alone.
There is only one case in which Apple would access our data in iCloud: if the alarm goes off on some of our photos and its employees have to examine them to confirm whether it is indeed illegal content. The probability of this happening in error is extraordinarily small, infinitesimal. Personally, I think this highly unlikely risk is worth taking if it helps fight child pornography.
A back door to access our iPhone?
Absolutely not. Apple does not allow access to the data on our iPhone at any time. What happens on our iPhone stays on our iPhone. The only point where our data can be accessed concerns the photos stored in iCloud, never those on our iPhone. There is no back door.
Can I still have pictures of my children?
Without the slightest problem. I have said it several times, but I will say it once more: Apple will not scan your photos to see if they contain sexual content involving children. If you have photos of your baby in the tub, that's fine; they will not be detected as inappropriate content. What Apple will do is look for the identifiers of photographs already known and cataloged in CSAM and compare them with the identifiers on your iPhone, nothing more.