August 9, 2021

Apple to Scan for CSAM

Last week, “Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse… The tool designed to detect known images of child sexual abuse, called ‘NeuralHash,’ will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified. Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure.” AP News
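To make the debate below concrete: systems like this reduce each image to a short perceptual fingerprint and compare it against fingerprints of known abuse images. The sketch below is a deliberately toy “average hash,” not Apple’s actual NeuralHash (which uses a neural network to produce its fingerprints); it only illustrates the general shape of the matching step, including the fuzzy-match threshold that lets edited copies still match and that critics say makes false positives possible.

```python
# Toy illustration of perceptual-hash matching. This is NOT Apple's NeuralHash;
# it is a simple "average hash" used here only to show the general idea:
# image -> short fingerprint -> fuzzy comparison against a database of
# fingerprints of known images.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (a list of 64 ints, 0-255) to a 64-bit int.

    Each bit records whether that pixel is brighter than the image's mean,
    so small edits (compression, slight brightening) usually leave the
    fingerprint unchanged or nearly unchanged.
    """
    mean = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (1 if p > mean else 0)
    return h

def hamming_distance(a, b):
    """Number of bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

def matches_known(image_hash, known_hashes, threshold=4):
    """Flag the image if its hash is within `threshold` bits of any known hash.

    A nonzero threshold is what lets slightly edited copies still match --
    and also what makes accidental or adversarial collisions possible.
    """
    return any(hamming_distance(image_hash, k) <= threshold
               for k in known_hashes)

# A slightly edited copy of a "known" image still matches; an unrelated
# image does not.
known = average_hash([10] * 32 + [200] * 32)
edited = average_hash([12] * 32 + [198] * 32)
unrelated = average_hash([10, 200] * 32)
```

In this toy, `matches_known(edited, [known])` is true while `matches_known(unrelated, [known])` is false. The Johns Hopkins result quoted below exploits exactly this kind of fuzziness: crafting an innocuous-looking image whose fingerprint lands within the threshold of a flagged one.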

Many across the political spectrum worry about the privacy implications of Apple’s new policies:

“We can all agree that CSAM [Child Sexual Abuse Material] is repulsive and should be erased from the internet. I haven't heard anyone argue differently. But, once you make an exception, it's hard to justify not making another, and another, and another. If your argument is ‘we don't have the technology to do that,’ it's pretty easy to resist the pressure to hand over user data. It's a lot harder to make the argument ‘well, we could do that, but we don't think it rises to our standard.’ The problem is, at some point, someone will come along and force that standard with a law or a court order.”
Jason Aten, Inc. Magazine

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor…

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change…

“The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.”
India McKinney and Erica Portnoy, Electronic Frontier Foundation

“Even if Apple’s policy were never expanded beyond its original scope (which is unlikely), innocent people would still need to be wary of it. Apple’s algorithm, as with all algorithms, will be prone to generating false positives. Do you want an intimate photo of yours, incorrectly identified as malicious by your phone, to be pawed and pored over by strangers checking it? That is a brazen invasion of personal privacy.”
Robert Schmad, Washington Examiner

“Testing similar systems, [Johns Hopkins University’s] researchers were able to fool the algorithm by sending seemingly innocuous images to other users that still wound up being flagged. Innocent users could still find themselves on a potential sex offender list as a result of such an attack. Apple is claiming that can’t happen, but I hope you’ll pardon me for being a bit skeptical about their openness and transparency on this subject.”
Jazz Shaw, Hot Air

“If the American government attempted to implement scanning systems like this, it would most certainly be understood as an unconstitutional warrantless search. It would violate the Fourth Amendment for the U.S. to scan all our images as we share them to make sure, in advance, they aren't pornographic… What Apple's doing now is designed to appear unobjectionable. But it's creating a framework for serious abuse of surveillance tools down the road.”
Scott Shackford, Reason

“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content. That’s the message they’re sending to governments, competing services, China, you.”
Matthew Green, Twitter

“Precisely because Apple touted its security, investigative journalists, political activists and human rights workers around the world have opted to use iPhones for their often sensitive and dangerous communications. Amnesty International revealed that the iPhones on which they rely could be compromised by spyware known as Pegasus, and that the vulnerability has existed for at least five years. Perhaps most devastatingly, Pegasus is a ‘zero click’ exploit, meaning you don’t have to click on a dubious message or link to enable it. Simply receiving it in a disguised message will implant the spyware… How are we to stay ahead of constantly innovating spies? We won’t. Act accordingly.”
Firmin Debrabander, Los Angeles Times

Other opinions below.


From the Left

“After years of insisting that sorry, no, it just couldn’t help cops and prosecutors trying to catch and convict people guilty of the most heinous crimes because, holding users’ privacy sacrosanct, it had designed technology with ironclad encryption, there’s a crack in Apple’s armor and its argument…

“We thank the tech giant for leaving the land of simplistic absolutes and entering the muddy world of moral tradeoffs, and hope that its awakening soon extends to helping crack devices to make or break cases against accused rapists, terrorists and more…

“Child sexual abuse is indeed a scourge. But adult sexual assault and murder are pretty awful, too, and there are countless cases in which prosecutors struggle and strain to get at evidence locked away on devices. Now that Apple’s shown that privacy isn’t the value that trumps all others, can people who want to protect other victims of other crimes finally get a hearing?”
Editorial Board, New York Daily News

From the Right

“Since Apple first entered the smartphone market in 2007, it has cultivated an image of ultra-privacy, making this move even more curious… This is a company that refused to help the FBI break encryption on an iPhone used by a terrorist who massacred over a dozen people in San Bernardino, California. Now it’s imposing top-down surveillance on the devices of all its users…

“What’s more, Apple’s move came just two weeks after PayPal said it would share its data with law enforcement, and just two weeks after a Big Tech anti-terrorism coalition said it would shift its focus to the far-right…

“We know that the Biden administration is publicly pressuring tech giants to follow its lead when it comes to censoring so-called misinformation. The question is, what kind of pressure is the administration putting on Big Tech behind the scenes? And what else is it demanding from the tech giants?”
Allum Bokhari, Breitbart

