
Apple’s new stance on iPhone privacy is troubling – and where is Tim Cook, by the way?


So much has been said and written already about the new Apple child safety features that the iPhone maker unveiled in recent days — features that represent an attempt to crack down on child exploitation and on the use of Apple devices to disseminate child sexual abuse material (CSAM). Apple laid out its plans and the technical descriptions of how the new protections will work with the arrival of iOS 15. The company then followed up with a FAQ that responds to (or, rather, is advertised as a response to) some of the many, many concerns of the privacy, security, and cryptography communities.

Here, though, is what I keep coming back to. Yes, these new Apple child safety features attack a problem that urgently demands a solution. How bad is the problem? This should tell you all you need to know: Google reported more than 546,000 instances of CSAM to the National Center for Missing & Exploited Children last year. For Facebook, the number topped 20 million. Be that as it may, though, something else can also be true at the same time: Apple’s communication about all this leaves a lot to be desired.



Apple, and privacy

Before we dive into a deeper discussion of the Apple child safety features, though, let’s first rewind the clock more than five years, to February of 2016.

A federal judge had just handed Apple an order demanding that the iPhone maker help the FBI unlock the iPhone of a dead terrorist following the San Bernardino attack. Complying would have required Apple engineers to produce a piece of code that essentially amounted to a golden key, something that represented a serious enough threat to users’ privacy that Apple CEO Tim Cook himself got involved. He denounced the judge’s order, rightfully warning that creating a backdoor into one iPhone would instantly make every iPhone in existence less secure.

The FBI was eventually able to unlock the terrorist’s iPhone without Apple’s help. But let’s pause and remember what that series of events entailed. Apple was ordered, not asked, by a judge to help the FBI do something good. Good, that is, at least in an isolated sense (getting inside a terrorist’s phone to help with an investigation). But Apple resisted. Why? Because of a hypothetical terrible scenario that lay at the bottom of a treacherously slippery slope. That scenario being the erosion of every Apple user’s privacy, as the cost of investigating that one dead terrorist.

Apple child safety features – a primer

Here in 2021, meanwhile, the company talks about all this very differently, in a way that is utterly terrifying to privacy experts.

Without getting too deep into the technical weeds: the company is essentially checking every Apple user’s iCloud Photos library against a database of hashes of known child exploitation imagery maintained by NCMEC. The arrival of iOS 15 will set all this in motion. And once the new Apple child safety features are live, if the photos in your account ever “match known CSAM images,” as Apple puts it, the enforcement mechanisms of this new security paradigm can kick in.
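If it helps to have a rough mental model of what “matching against a hash list” means, here is a deliberately oversimplified sketch in Python. To be clear, this is not Apple’s implementation: the real system relies on a perceptual hash called NeuralHash, blinded on-device matching, and a reporting threshold Apple has not disclosed, none of which is reproduced here. The hash function, the database entries, and the threshold value below are hypothetical placeholders, meant only to illustrate the general concept of comparing uploaded photos against fingerprints of known images.

import hashlib

# Conceptual sketch, not Apple's system. Every value here is a placeholder.

# Hypothetical fingerprints of known, previously identified images.
# A real list would come from NCMEC and other child safety organizations.
KNOWN_IMAGE_HASHES = {
    "placeholder_hash_1",
    "placeholder_hash_2",
}

# Assumed placeholder: act only after this many matches accumulate.
MATCH_THRESHOLD = 30


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint. A real system would use a perceptual hash that
    survives resizing and re-encoding; SHA-256 is used here only for
    illustration."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploaded_images):
    """Count how many uploaded photos match the known-hash list."""
    return sum(fingerprint(img) in KNOWN_IMAGE_HASHES for img in uploaded_images)


def should_flag_account(uploaded_images):
    """Flag the account only once the match count crosses the threshold."""
    return count_matches(uploaded_images) >= MATCH_THRESHOLD

Even in this toy version, the important design point is visible: nothing scans for “anything bad in general,” only for exact fingerprints that already sit in the database. The database itself is the part a government could, in theory, ask to expand, and that is precisely the slippery slope critics are worried about.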

Now, in the FAQ Apple released that’s supposed to reassure people about all this, Apple seemed to acknowledge that some people might see a slippery slope here. In the document, Apple posed a question that such people might ask: “Could governments force Apple to add non-CSAM images to the hash list?”

The slippery slope

In other words, could a country like China come to Apple and say: you know what, since you already have a system in place to act on CSAM, informed by a database of specific material, we want to give you another database. Instead of CSAM, though, we want you to search for something different and take action whenever you find it. That something we want you to look for? Users with non-government-approved content on their devices.

Apple’s response? From its new document: “Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that experts at NCMEC and other child safety groups have identified. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands.”

Did you catch the subtle shift in tone?

Five years ago, Apple didn’t even want to build a particular capability because of the awesome power it might give to federal authorities. Now, Apple is building such a capability anyway. And the company’s response to the potential future misuse of this new capability is, essentially, “We’ll do our best to resist.”

I caught up briefly on Monday with Matthew Green to talk about all this. He’s a Johns Hopkins University professor who has worked on cryptographic technologies. And he agreed with me that Apple is being a little intellectually inconsistent, given how it has handled privacy implications in the past versus what could happen here. “Now,” Green told me, “they’re saying don’t worry. Trust us. We promise to keep this technology from use more broadly … But it absolutely can be repurposed for bad things.”

The China question

You can roll your eyes all you want at anyone making the slippery slope argument here. But the fact of the matter is that when it comes to the new Apple child safety features, the hypothetical is the whole ballgame.

In a call the company hosted with journalists on Monday to talk further about some of these issues, one journalist asked whether Apple might leave China if it ever came to this. If China, that is, ever made an offer that Apple couldn’t refuse. The answer given was not at all reassuring: well, this system … is not launching in China right now.

Tim Cook’s silence on all this, by the way, is pretty deafening. He’s been at the vanguard of the company’s attempt to position itself as privacy-focused, even though its actions at times, especially in China, can seem so incongruous.

Apple vs China

Remember Apple’s promise to “refuse” demands for a wider implementation of the child safety features? Apple has actually been there before in China, with the Chinese government forcing it to alter its security practices.

One example came back in 2018, when Chinese authorities strong-armed Apple into doing something new. For the first time, Apple would start storing the encryption keys for Chinese iCloud accounts inside China. Before, Chinese officials had to go through US courts to obtain information about Chinese iCloud users; now, they can breeze through their own country’s legal system. “While we advocated against iCloud being subject to these laws, we were ultimately unsuccessful,” Apple said in a statement at the time.

So much for that steadfast refusal.

How about when Apple unveiled its new “Private Relay” system during WWDC 2021? Basically, it’s a feature that will mask users’ web browsing behavior from ISPs and advertisers. But this new system, sigh, isn’t coming to China. As Facebook’s former chief security officer Alex Stamos tweeted sarcastically: “We believe privacy is a fundamental human right. *Offer not valid in the People’s Republic of China.”

Pandora’s box

Maybe there’s nothing to worry about here. Maybe the more than 6,600 (at last count) privacy and security experts who signed an open letter in opposition to Apple’s plan are overwrought. Or perhaps a Chinese intelligence agency will pay off an Apple employee to do something untoward once these new child safety features are in place. And maybe a government, somewhere down the line, will successfully coerce Apple into expanding this system into something much more dystopian.

Your guess is as good as mine as to how this all plays out. Right now, all we’ve got to reassure us are Apple’s promises. And the backstops built into Apple’s new system.

The thing about Pandora’s box, though, is that there’s a super-easy way to avoid the contents of the box spilling out into the world by mistake. It’s to never build that box in the first place.

In the end, “it really doesn’t matter what Apple’s claimed process protections are,” tweeted former NSA whistleblower Edward Snowden. “Once they create the capability, the law will change to direct its application.”

