Apple yesterday announced a set of tools to scan photos for child sexual abuse material in the United States: three measures whose operation we have explained in depth, and whose details have not escaped controversy. The heart of the debate is the possibility that these measures could be repurposed for any other type of content.
The most obvious misuse would be to curb political dissent and threaten people's freedoms in authoritarian countries. Regimes such as Saudi Arabia, China and Venezuela are the ones that could benefit from these new tools. But what are the chances of that becoming a reality?
A first step toward scanning any content on the iPhone?
All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan not just children's accounts, but anyone's.
In this paragraph, the Electronic Frontier Foundation criticizes Apple's plans as the construction of a backdoor into our private lives. And it condenses a good part of the issues that critics raise with this measure.
The argument is that if Apple can scan photos on the device before they are encrypted, review the resulting alert and notify an organization, what is stopping it from doing the same with messages, links, browsing history and other private content? All the more so when the hash database to check against is provided by a third party: in this case, the Child Sexual Abuse Material (CSAM) database maintained by the National Center for Missing & Exploited Children (NCMEC). In another country, there could well be an organization that, alongside child pornography, slips in photos of dissident slogans, protests, posters and other subversive content. It doesn't seem far-fetched.
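To make the mechanism concrete, here is a minimal sketch in Python of what on-device matching against a third-party hash list amounts to. This is not Apple's implementation: the real system uses a perceptual NeuralHash and cryptographic private set intersection rather than a plain SHA-256 lookup, and the hash value and file name below are placeholders.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of hashes of known images, supplied by a third
# party such as NCMEC. Apple's actual design uses a blinded NeuralHash
# database, so the device never sees the list in the clear; plain
# SHA-256 is used here only to illustrate the general idea.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_before_upload(photo: Path) -> bool:
    """Check a photo against the blocklist before it leaves the device."""
    return file_hash(photo) in KNOWN_HASHES

photo = Path("photo.jpg")  # placeholder path
if photo.exists() and scan_before_upload(photo):
    print("Match: flag for human review and report")
```

The critics' point falls out of the structure: the scanning code is indifferent to what the hashes in the list represent, so whoever supplies the list decides what gets flagged.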
Whoever controls this list can search your phone for whatever content they want, and you really have no way of knowing what's on that list because it's invisible to you (and just a bunch of opaque numbers, even if you hack your phone to get the list).
– Matthew Green (@matthew_d_green) August 5, 2021
Johns Hopkins University cryptography professor Matthew Green has been very critical of the measures Apple is going to take. And he rightly points out this weak spot: even if we trust Apple, whoever is in charge of this database is the one who really holds the power. Others, such as University of Cambridge security engineering professor Ross Anderson, point out that the measures simply turn mass citizen surveillance into distributed surveillance that runs on devices rather than in the cloud.
These systems have been in operation for over a decade
On the other side of the coin, the truth is that scanning photos for child abuse material is nothing new. Yes, here it happens on the user's own device instead of in the cloud. But tech companies have been screening photos for child abuse material via hashes for over a decade.
We are faced with a sensitive issue, not only because of the type of crime being pursued but also because of Apple’s position in favor of privacy.
Google, for example, has used hashes to identify child pornography since 2008. Microsoft developed the well-known PhotoDNA hashing technology in 2009 and uses it across all of its cloud services. It is also used by Twitter, Gmail, Facebook, Adobe, Reddit and Discord, as well as by NCMEC itself. Apple has been analyzing photos uploaded to iCloud with the same techniques, in the cloud, since at least 2019.
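PhotoDNA itself is proprietary, but it belongs to the family of perceptual hashes: unlike a cryptographic hash, visually similar images produce similar hash values, so a match survives resizing or re-encoding. A toy average-hash sketch in Python (not PhotoDNA's actual algorithm; the file names are placeholders) illustrates the principle:

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to 8x8 grayscale, then set one bit per pixel
    depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A small Hamming distance means the images are almost certainly the
# same picture, even after resizing or re-compression.
if hamming_distance(average_hash("original.jpg"),
                    average_hash("resized.jpg")) <= 5:
    print("Likely the same image")
```

The smaller the Hamming distance between two hashes, the more alike the images; real systems tune this threshold to keep false positives rare.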
AIUI, the database cannot be changed arbitrarily and has been around for over a decade without this sort of thing happening to photos in the cloud. I think this describes a theoretical risk that ignores how authoritarian governments actually don’t need excuses.
– Charles Arthur (@charlesarthur) August 5, 2021
As technology journalist Charles Arthur, who covered this kind of technology during his time at The Guardian, recounts, these techniques have been in use for over a decade, and at no point have authoritarian governments abused them. In part, as Arthur explains, this is because:
For this scenario to work, the government needs to know the photo that is on the person's phone and get a version of it into the database. It's absurd. How would they obtain it to create the hash? They will simply arrest the person on some pretext. That is what dictators do.
– Charles Arthur (@charlesarthur) August 5, 2021
An authoritarian government does not need excuses to arrest a suspect. It simply arrests them, because that is what a dictatorship does. For a government to manufacture this pretext, it would have to know in advance which photo is involved and that it is on the user's device, before uploading it to the database.
Pursuing child pornography should be a priority for tech companies: reporting offenders to the authorities and banning the use of their services to distribute it. Everything seems to indicate that this new method is one more means of combating this scourge. And based on what we have seen so far, there are no known cases of these tools being abused, since such abuse does not seem practical for an authoritarian regime.