Apple continues to work on security, and a new protection system has been unveiled: detection of child sexual abuse material (CSAM).
Apple against child abuse
According to an earlier report, Apple was expected to announce a new tool that uses hashing algorithms to detect photos related to child abuse. The system would be installed on the user’s device and would compare images against known illegal material to find matches.
The new system works with image fingerprints that, according to the company, do not violate its current privacy rules, and it intervenes when files are uploaded to iCloud Photos in order to prevent the spread and backup of this material.
It is now a reality. Apple has officially announced a new initiative to protect children on iPhone, iPad and Mac. With the statement “Protecting children is an important responsibility,” Apple announced three new features for this security advance.
CSAM detection
By CSAM, Apple means content that depicts sexually explicit activity involving a child. To detect it, Apple has created a new recognition method that scans user content for matches through the aforementioned hashing process, so that CSAM cases can be reported to the National Center for Missing and Exploited Children.
“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. This matching process is powered by a cryptographic technology called Private Set Intersection, which determines whether there is a match without revealing the result. Private Set Intersection (PSI) allows Apple to learn if an image hash matches the known CSAM image hashes, without learning anything about image hashes that do not match. PSI also prevents the user from learning whether there was a match.”
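As a rough illustration of the matching step, picture each image reduced to a fingerprint and compared against a set of known hashes. This sketch is deliberately simplified: Apple uses a perceptual hash (NeuralHash) and the comparison happens under PSI so the device never sees the result, whereas the plain SHA-256 lookup below is visible to the caller. All names and sample values here are hypothetical:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length fingerprint.

    Apple uses a perceptual hash so visually identical images map to
    the same value; plain SHA-256 is used here only to keep the
    sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical set of fingerprints of known illegal images
# (stand-in byte strings, for illustration only).
known_hashes = {fingerprint(b"known-image-1"), fingerprint(b"known-image-2")}

def matches_known_material(image_bytes: bytes) -> bool:
    # In the real system this comparison runs under PSI, so neither
    # the device nor the user learns whether a match occurred.
    return fingerprint(image_bytes) in known_hashes
```

Only exact fingerprint matches against the known set are reported; a new photo that resembles nothing in the database produces no match.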
To avoid mistakes, Apple uses a technique called threshold secret sharing, which ensures that a detection is only acted upon once a minimum number of matches has accumulated, so that a single false CSAM match cannot flag an account. According to Apple, the system has an error rate of less than one in one trillion accounts per year.
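The effect of the threshold can be sketched with simple probability: if each false match is rare and (assuming independence) an account is only flagged after several matches, the account-level error rate shrinks multiplicatively. The threshold value below is made up for illustration; Apple has not stated it here:

```python
THRESHOLD = 5  # hypothetical number of matches required before review

def should_flag_account(match_count: int, threshold: int = THRESHOLD) -> bool:
    """An account is only surfaced once matches reach the threshold."""
    return match_count >= threshold

def false_flag_probability(p_single: float, threshold: int) -> float:
    """Chance that `threshold` independent false matches all occur.

    Illustrates why requiring multiple matches pushes the per-account
    error rate toward Apple's stated one-in-a-trillion figure.
    """
    return p_single ** threshold
```

For example, even with a generous 1-in-1,000 per-image false-match rate, five independent false matches would occur with probability around 10^-15 for a given set of five images.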
Messages
One of the most important pieces of news for the safety of minors comes with Messages. Children whose Apple devices are linked to a family through iCloud gain a new layer of protection in the app.
Whenever a minor receives an image that may be sensitive, it will appear blurred and the app will immediately display a warning. The warning explains to the minor why the image is inappropriate and asks whether they still want to see it; if they do, the parent on the iCloud family account will receive a notification “to make sure you’re okay.”
The same applies when a minor tries to send a photo that is considered inappropriate because it is sexually explicit.
Apple notes that the system relies on on-device machine learning to identify sensitive content without breaking the service’s end-to-end encryption, so conversations remain private while becoming safer for children.
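The flow described for Messages can be sketched as a purely on-device check that runs before an image is shown. The classifier and all function names below are hypothetical stand-ins, since Apple has not published its model or API:

```python
from dataclasses import dataclass

@dataclass
class IncomingImage:
    data: bytes
    blurred: bool = False

def looks_sensitive(image: IncomingImage) -> bool:
    """Hypothetical stand-in for the on-device ML classifier."""
    return b"sensitive" in image.data  # toy heuristic, sketch only

def handle_incoming(image: IncomingImage, child_account: bool,
                    child_chooses_to_view: bool = False) -> list:
    """Blur sensitive images for child accounts and queue warnings.

    Everything runs on the device, so message content never leaves
    end-to-end encryption.
    """
    events = []
    if child_account and looks_sensitive(image):
        image.blurred = True
        events.append("warn_child")  # explain why, ask before viewing
        if child_chooses_to_view:
            events.append("notify_parent")  # "to make sure you're okay"
    return events
```

The key design point from the article is reproduced here: the decision is made locally and the parent is only notified if the child chooses to proceed.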
Searches and Siri
Finally, Siri and the search functions on Apple devices have been adapted to offer help on the issue of child abuse.
“Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help in unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.
Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and will provide resources from partners for getting help with this issue.”
These new features will arrive in updates to iOS 15, iPadOS 15 and macOS Monterey. They will launch in the United States and expand to other regions as soon as possible.
There is no doubt that this is a big step for social welfare and for the protection of minors who use these devices, and it could also indirectly affect the success of Apple, which has today become the sixth highest-earning company in the world.