The world of technology is currently preoccupied with a very sensitive topic, or rather two very sensitive topics: the prevention and prosecution of child abuse on the one hand, and the handling of private data on the other.
What happened? Apple has announced three new technical measures intended to protect children. The corresponding software updates will initially appear only in the USA, with the operating systems iOS 15, iPadOS 15, watchOS 8, and macOS Monterey in the course of the year.
Since then, there has been a lot of criticism of the changes, especially with regard to data protection, though most critics emphasize that Apple’s underlying goal is not the problem. Here is what the measures involve:
1. Receiving and sending nude pictures
- Only affects child accounts (can be activated via Family Sharing) and Apple’s Messages app
Machine learning is used on the device itself to check whether a sent or received image is a nude photo. If the algorithm flags an image, it is made unrecognizable, and a warning message appears along with information on where to seek help. Last but not least, the parents are notified of the incident if the picture is opened despite the warning and the child is twelve years old or younger.
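To make the sequence more concrete, here is a minimal sketch of that flow in Swift. All types and function names are hypothetical illustrations for this article; Apple has not published the model or an API for this feature.

```swift
import Foundation

// All types and function names below are hypothetical illustrations;
// Apple has not published the model or an API for this feature.

struct ChildAccount {
    let age: Int
}

enum ImageVerdict {
    case harmless
    case likelyNude
}

// Stand-in for the on-device machine-learning classifier (assumption:
// in reality this would be a Core ML model running locally).
func classify(_ imageData: Data) -> ImageVerdict {
    return .harmless // stub
}

func handleImage(_ imageData: Data, for account: ChildAccount) {
    guard classify(imageData) == .likelyNude else {
        return // image is displayed normally
    }

    // 1. Blur the image and show a warning with pointers to help resources.
    blurAndWarn()

    // 2. If the child opens it anyway and is 12 or younger, notify the parents.
    if userChoosesToViewAnyway(), account.age <= 12 {
        notifyParents()
    }
}

// Stubs standing in for UI and messaging plumbing.
func blurAndWarn() { print("Image blurred; warning and help resources shown.") }
func userChoosesToViewAnyway() -> Bool { false }
func notifyParents() { print("Parents notified of the incident.") }
```

The important design point, per Apple’s description, is that both the classification and the blurring happen entirely on the device; no image leaves it for this check.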
2. Search for child pornography
- Only affects photos (not videos) that you want to save via iCloud
Before the upload, the digital fingerprint (hash value) of the image is compared on the device with the hash values of a database of child pornographic material. The database comes from the National Center for Missing and Exploited Children (NCMEC); Apple says the hash values cannot be read from the outside. If an unspecified number of matches is exceeded, the affected images are reviewed manually to rule out a false alarm. Only then is the account blocked and a report sent to the NCMEC.
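The matching logic can be sketched roughly as follows. The hash function, database format, and threshold here are simplified assumptions; Apple’s actual system uses a perceptual hash (NeuralHash) against a blinded database, so that individual matches stay unreadable, even to Apple, until the threshold is crossed.

```swift
import Foundation

// Simplified stand-in for a perceptual image hash. Apple's real NeuralHash
// is robust to resizing and re-encoding; this stub is not.
func perceptualHash(of imageData: Data) -> String {
    return String(imageData.hashValue, radix: 16)
}

struct UploadScreener {
    let knownHashes: Set<String> // hashes derived from the NCMEC database
    let threshold: Int           // Apple has not disclosed the actual value
    var matchCount = 0

    // Called for each photo before it is uploaded to iCloud.
    // Returns true only once the number of matches exceeds the threshold,
    // at which point human review (not automatic blocking) follows.
    mutating func screen(_ imageData: Data) -> Bool {
        if knownHashes.contains(perceptualHash(of: imageData)) {
            matchCount += 1
        }
        return matchCount > threshold
    }
}

// Usage; the values here are invented for illustration.
var screener = UploadScreener(knownHashes: ["abc123"], threshold: 30)
let needsReview = screener.screen(Data([0x01, 0x02]))
```

The threshold is what the manual review mentioned above hinges on: a single accidental match is not supposed to trigger anything on its own.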
3. Improved assistance from Siri and Search
If inquiries relating to child abuse and child pornography come in via the Siri voice assistant or the search, the corresponding services should display a warning or offer better assistance than before.
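Functionally, this amounts to little more than routing certain queries to a warning and to help resources, roughly like this sketch (the matched terms and responses are invented for illustration):

```swift
import Foundation

// Hypothetical illustration; the terms and responses are invented.
let sensitiveTerms = ["child abuse", "csam"]

func respond(to query: String) -> String {
    let lowered = query.lowercased()
    if sensitiveTerms.contains(where: { lowered.contains($0) }) {
        return "Warning shown, with pointers to reporting and support resources."
    }
    return "Normal search results."
}

print(respond(to: "How do I report child abuse?"))
```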
Fears of a slippery slope
In most cases, the criticism of the measures concerns a possible expansion of the technology: if child pornographic material can be scanned for in iCloud photos this way, it is just as conceivable that other content could be searched for independently of Apple’s cloud service, and that other companies such as Google will follow suit.
If a state takes action against homosexuality, for example, such mechanisms could help to marginalize and discriminate against people of that sexual orientation; this example is cited often.
Apple defends itself against the allegations: in an FAQ about the new measures, published in the meantime, Apple states that it will reject such requests from governments. In addition, the scanning process is designed in such a way that only child pornographic material can be recognized.
At the same time, Apple has to comply with local laws (one of the reasons why the new measures are launching only in the USA for the time being). Cases like the recent removal of the dating app “Hornet” from the Turkish App Store show what that can mean for homosexuals.
What happens next?
As things stand, it can be assumed that Apple will roll out the new features in the USA as planned in the course of the year. An expansion to other countries, meanwhile, is likely to depend primarily on legal questions.
Apple customers in Europe are therefore not affected for the time being. As long as the measures remain unchanged, US customers can also avoid them if they wish by not using iCloud and Family Sharing.
No comment function?
Since this is a very sensitive topic, as described at the beginning, we are outsourcing the comments to our forum, where we can moderate them better. Link to the forum post