eXtensions - Saturday 7 August 2021
Saturday Comment: Apple Initiative on Child Safety: Pass/Fail
By Graham K. Rogers
In the early 1970s I took a call at a police station from a concerned person who asked about the showing of pornographic movies: that ended with a raid and several arrests. The 16mm film arrived later and the next evening was run by the officer in the case so he could give evidence: I have seen the film and it is pornographic - likely to deprave and corrupt. For some odd reason, cars from all over the county were in the area on enquiries and just dropped in for a break. It was actually fairly mild, although clearly pornographic, and the performers were certainly in their late teens and 20s. With other media, such as photographs and magazines, there was a lot going around even in the days before widespread electronic distribution. The internet has perhaps expanded that - although I am not totally convinced - as it has made certain materials more widely available. I read news from many different countries each day, and it is clear that several times a month the police have enough evidence to convict scores of people for possession of images of child pornography, bestiality and child cruelty. Apart from the initial contact regarding that movie, and the arrest of a pair of teenage brothers who had been stealing ladies' underwear from local washing lines, my exposure to anything like that has been limited, but I would know it if I saw it.
I have read scores of comments on Twitter and other online sources that were critical not just of the search for CSAM but of Apple's apparently cavalier attitude to user privacy, a cause it has made its own in recent years and for which it has won much respect. It now seems as if this is to be jettisoned. As Apple has previously claimed about encryption, once you open a door for the good guys it is open for everyone. Edward Snowden wrote, "No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow" (Stephen Warwick, iMore). These CSAM checks cannot be run if iCloud Photos is turned off. That also means these checks cannot be run on the devices (synchronization implies that a photo in iCloud is also on the device). Nor can iCloud backups be checked. Presumably these are compressed or in a specific file format, as a Time Machine backup on a disk would be. Those who are determined to store such images on their devices will simply look for new methods to do this, or seek out other ways to back up their data.

The EFF was among a number of organizations that were critical of the approach. Dorset Eye mentions several critics in its overview of Apple's announcement and a number of others who support the move. The consensus seems to be that CSAM is abhorrent, but that opening this door will lead to other demands being made and an erosion of trust, particularly in Apple and its devices. Somehow, Apple is going to have to walk this back carefully. MacDailyNews, normally one of the more partisan Apple sources, in reporting the EFF comments writes, "Apple must have been placed in an untenable situation to introduce this backdoor, destroying their vaunted claims to protecting privacy, or Tim Cook has completely lost the plot." The report enlarges on the idea of external pressure bringing this about, but the writer is not convinced, ending with "We expected Apple to be better. Apple failed."
The process that identifies these images is complex and involves two layers of encryption. If an image is in the iCloud library it goes through a matching step that assigns a unique numerical identifier to a suspect image rather than analyzing the image content itself. A single image is not enough to trigger action; only when enough matches have accumulated does the next step follow, where "someone at Apple will examine the contents of the safety vouchers for those flagged images before reporting the incident to law enforcement" [my italics]. John Gruber (Daring Fireball) notes that the whole package is grouped under a safety banner, but wonders if announcing everything together was an error. This is a valid point, as users can easily understand the need to prevent undesirable communication with children (e.g., dick pics and the like). Including the scanning of the image library in the same announcement has elevated it (perhaps) to a more political level: perception is all, and Apple should have been sensitive to this. This lengthy article is worth examining carefully, particularly Gruber's comments on the potential for features to come that others, like Google or Facebook, may well take up. Apple could also add end-to-end (E2E) encryption of the photo library itself, which is currently unavailable, although other Apple services are E2E encrypted. This is speculation, but it makes much sense. He notes, however, that despite these inbuilt safeguards, we are "still left with completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future."
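To make the threshold idea concrete, here is a minimal sketch of how that kind of matching could look in code. It is not Apple's implementation - the real system uses NeuralHash, private set intersection and encrypted safety vouchers - and every name and number below is hypothetical; it simply illustrates the point that a single match does nothing and only an accumulation of matches would trigger human review.

import Foundation

// Hypothetical sketch only: each uploaded photo gets a perceptual hash,
// which is compared against a database of known hashes. Nothing is
// flagged for review until the number of matches crosses a threshold.
struct PhotoRecord {
    let id: UUID
    let perceptualHash: String   // stand-in for a NeuralHash-style identifier
}

struct MatchResult {
    let matchedPhotoIDs: [UUID]
    let exceedsThreshold: Bool
}

func evaluateLibrary(photos: [PhotoRecord],
                     knownHashes: Set<String>,
                     threshold: Int) -> MatchResult {
    // Keep only the photos whose hashes appear in the known-hash database.
    let matched = photos.filter { knownHashes.contains($0.perceptualHash) }
    // A single match does nothing; only past the threshold would the
    // "safety vouchers" become readable and a human review begin.
    return MatchResult(matchedPhotoIDs: matched.map { $0.id },
                       exceedsThreshold: matched.count >= threshold)
}

// Example usage with made-up data and a made-up threshold.
let library = [
    PhotoRecord(id: UUID(), perceptualHash: "a1b2c3"),
    PhotoRecord(id: UUID(), perceptualHash: "d4e5f6")
]
let database: Set<String> = ["a1b2c3"]
let result = evaluateLibrary(photos: library, knownHashes: database, threshold: 30)
print("Matches: \(result.matchedPhotoIDs.count), review triggered: \(result.exceedsThreshold)")

The point of the threshold, as Apple describes it, is that one false positive on an innocent photograph cannot by itself expose anything to a reviewer.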
The idea put forward by Apple may be noble, but the way this has been presented - and the perceptions it has created - will not leave Cupertino unstained after its previous championing of privacy. Whether this was a good idea is still up in the air, but it all leaves a nasty taste in the mouth. Heads will roll.
Graham K. Rogers teaches at the Faculty of Engineering, Mahidol University in Thailand. He wrote in the Bangkok Post's Database supplement on IT subjects, and for the last seven years of Database he wrote a column on Apple and Macs. After three years writing a column in the Life supplement, he is no longer associated with the Bangkok Post. He can be followed on Twitter (@extensions_th)