eXtensions - Monday 16 August 2021
Apple Bets the Family Silver
By Graham K. Rogers
The last 10 days have seen Apple trying to walk back some of the criticism leveled at it, particularly by privacy advocates, over its CSAM detection system. As was evident early in the online comments, some missteps had clearly been made. Last week, for example, John Gruber (Daring Fireball) wondered why the three announcements had been made together, which made the situation appear worse than it perhaps was; Apple now seems to agree with this assessment. There was no real lessening of the criticism online, but significantly the week ended with Apple rolling out some big guns, along with a further document release that falls somewhere between the initial technical outline and the simpler FAQ. The three documents themselves are significant. Later articles have been more measured, with some comparing how others deal with similar detection of CSAM images, as well as the pressures from regulators. Even so, there is still the sense that, for a general good (child protection), Apple may have opened a door that could be used for political purposes in the future. Apple denies this possibility, but a number of informed commentators insist that the door is open, and even writers normally positive towards Apple are less sure.
One of the commentators normally sympathetic to Apple, Jason Snell (Macworld), expressed his discomfort, not with the CSAM image scanning itself, but with the way this has all been rolled out, noting that Apple "doesn't seem to have anticipated all the pushback its announcement received". Snell's explanation of the way the technology is to be used is easy to absorb, but he notes that there appears to be a missing part: "there's another shoe to drop here, one that will allow Apple to make its cloud services more secure and private". Commenting further on the implementation, he notes that the discomfort others are feeling is valid. Although this tool has a specific design, there is always that "what if?" sometime in the future.
Dean explains the different parts of the image identification systems clearly, but when examining the iCloud Photos aspect and the use of hashes, he outlines the fear that many have: that Apple may not be able to stop the expansion to other areas as it has claimed, and that refusing such requests from certain governments may not be enough. We have seen the way that China has forced Apple (and Google) to make changes; and recently Apple had to ensure that specific apps were offered on iPhones sold in Russia.
Now Apple's own staff are reported to be unhappy with the CSAM image implementation, particularly over the fear that the "feature could be exploited by repressive governments looking to find other material for censorship or arrests". Other opinions expressed internally match some of the comments in outside reports: the worry that Apple has opened a door that cannot be shut.
Lovejoy's paragraph ends with, "I am frankly stunned that Apple didn't understand that any reduction of privacy, however minor it may be, and however good the reason, was going to create massive waves." He looks further at the missteps and notes, like others, that the announcement was severely mishandled. Also like others, he raises the question of whether this could be related to a future announcement on encryption; if so, he comments, it was done in the wrong order. Lovejoy amended his original content to reflect the change in the Force that occurred when The Wall Street Journal released Joanna Stern's interview with Craig Federighi on the confusion (my original link - Benjamin Mayo, 9to5Mac). The interview is a recognition by Apple that much more work needs to be done by the company to assuage the fears of users (and critics) over the announcement. By rolling out Federighi, who has a good public image, it is clear that Apple knows it has stumbled. He acknowledges this himself: "we wish that this had come out a little more clearly, because we feel very positively and strongly about what we are doing, and we can see that it has been widely misunderstood".
An interesting point was the threshold that Apple intends to use before there is human intervention. The original technical document had hypothesized that the algorithm would elevate the examination at 10 images, and other speculation suggested 5, but Federighi states that the threshold is 30 images. To me that removes any possible suggestion of an accidental download: with 30 images or more, the user is a collector. Stern points out that Facebook, Google and Microsoft already access the NCMEC image database and scan images, but Apple wants to do this using on-device software for privacy reasons. When Stern tackles Federighi about the software "coming on to my device", he responds that this is a common misunderstanding and explains that what happens is part of a complex process with separate actions in "the pipeline" from Library to iCloud. She then asks if this is a back door, repeating the fears that so many have, but Federighi is adamant: in no way is this a back door, adding that he really does not understand this characterization. This sounds almost naive when he says it with some exasperation. However, he insists that the question many have, about a future request from a government to identify other messages or images, is answered by the multiple levels in the process pipeline, which protect the user (and the device) from any such incursions.
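To make the threshold idea concrete, here is a minimal sketch in Swift of a count-then-escalate check. To be clear about assumptions: the neuralHash(of:) function, the CSAMMatcher type and its fields are hypothetical stand-ins invented for illustration; Apple's actual system uses its NeuralHash algorithm together with a cryptographic threshold secret sharing scheme, not a plain counter like this.

```swift
import Foundation

typealias PerceptualHash = String

// Hypothetical stand-in for the on-device hashing step. A real
// perceptual hash is designed to survive resizing, cropping and
// re-encoding; this placeholder is illustrative only.
func neuralHash(of imageData: Data) -> PerceptualHash {
    return String(imageData.hashValue)
}

struct CSAMMatcher {
    // Blinded database of known-image hashes shipped with the OS.
    let knownHashes: Set<PerceptualHash>
    // The threshold Federighi cites before any human review.
    let threshold = 30

    // Count matches across a photo library; only when the count
    // reaches the threshold would an account be escalated for review.
    func shouldEscalate(library: [Data]) -> Bool {
        let matches = library.filter { knownHashes.contains(neuralHash(of: $0)) }.count
        return matches >= threshold
    }
}

// Usage: a single match, or even 29, triggers nothing.
let matcher = CSAMMatcher(knownHashes: ["example-hash"])
print(matcher.shouldEscalate(library: []))  // false
```

In Apple's published design the match result travels inside an encrypted safety voucher that the server cannot read until the threshold is crossed, which is the layered pipeline Federighi refers to; the sketch above only illustrates the arithmetic of the 30-image threshold.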
Graham K. Rogers teaches at the Faculty of Engineering, Mahidol University in Thailand. He wrote on IT subjects in the Bangkok Post's Database supplement, including a column on Apple and Macs for the last seven years of Database. After three years writing a column in the Life supplement, he is no longer associated with the Bangkok Post. He can be followed on Twitter (@extensions_th)