For those who came in late, Apple raised an eyebrow or two when it said it was scanning users' accounts in the hunt for kiddie porn. While no one was against its paedophile hunting, the Freedom of the Press Foundation is calling Apple's plan to scan photos on user devices to detect known child sexual abuse material (CSAM) a "dangerous precedent".
The concern is that once Apple builds such a scanning capability, it could be misused when Apple and its partners come under outside pressure from governments or other powerful actors.
The Foundation joins the EFF, whistleblower Edward Snowden, and many other privacy and human rights advocates in condemning the move.
Advocacy Director Parker Higgins said privacy invasions come from situations where "false positives" are generated -- that is to say, an image or a device or a user is flagged even though there are no sexual abuse images present.
“These kinds of false positives could happen if the matching database has been tampered with or expanded to include images that do not depict child abuse, or if an adversary could trick Apple's algorithm into erroneously matching an existing image.”
Apple said it is extremely unlikely that its glorious software would flag an innocent image as child abuse material for no reason. Given that its software cannot even manage a change to daylight saving time, however, we would have thought the Foundation is right to be concerned.
The next issue is adversaries who can change the contents of the database that Apple devices check files against. An organisation that could add leaked copies of its internal records, for example, could find devices that held that data -- including, potentially, whistleblowers and journalists who worked on a given story.
This could reveal the extent of a leak if it is not yet known. A government that could include images critical of its policies or officials could find dissidents who are exchanging those files.
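For the curious, the attack the Foundation describes can be sketched in a few lines of Python. To be clear, the toy hash and every name below are our own illustrative assumptions -- this is not Apple's NeuralHash pipeline, just the shape of the problem.

```python
# Sketch of hash-database matching. The toy hash and all names here are
# illustrative assumptions -- this is not Apple's actual NeuralHash pipeline.

def toy_perceptual_hash(data: bytes) -> int:
    # Stand-in for a perceptual hash. Real perceptual hashes map
    # similar-looking images to nearby values, which is exactly what
    # makes crafted collisions feasible.
    return sum(data) % 256

def is_flagged(data: bytes, match_db: set) -> bool:
    # A device is flagged when a file's hash appears in the database.
    return toy_perceptual_hash(data) in match_db

# The database the device checks against is opaque to the user.
match_db = {toy_perceptual_hash(b"genuine CSAM entry")}

# Attack 1: whoever controls the database quietly adds a benign target,
# say a leaked internal memo, and every device holding it lights up.
match_db.add(toy_perceptual_hash(b"leaked internal memo"))
print(is_flagged(b"leaked internal memo", match_db))  # True: journalist flagged

# Attack 2: because the hash is many-to-one, an adversary can craft an
# innocent-looking file that collides with an existing database entry.
print(toy_perceptual_hash(b"ab") == toy_perceptual_hash(b"ba"))  # True: collision
```

The point is not that Apple's hash is this crude, but that both attacks depend only on who controls the database and on collisions being findable -- neither of which the user can audit.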
Apple fanboys who work as journalists have been sold on claims that the outfit offers strong privacy protections. They go on endlessly about how Apple refused to redesign its software to open the phone of an alleged terrorist -- not because it wanted to shield the content on a criminal's phone, but because it worried about the precedent it would set for other people who rely on Apple's technology for protection.