
No, Apple Isn’t Suddenly Spying On Your iMessages—And This One Switch Stops It Scanning Your Photos


Despite some scare stories suggesting Apple is about to start scanning all your iMessages for evidence of child sexual abuse material (CSAM), the tech giant has made clear it’s not doing that at all.

It is, however, going to scan all photos users upload to iCloud, using code that compares the “hash” of each image to known hashes of child sexual abuse photos, stored in databases from the likes of the National Center for Missing & Exploited Children. (Think of a hash as a unique numerical representation of an image. It lets computers check whether one photo is the same as another without comparing the images directly. Once a match is discovered, it’ll be reviewed by a human to confirm that it requires reporting to NCMEC and the relevant police authorities.)
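To illustrate the basic idea of hash matching, here is a minimal Python sketch using an ordinary cryptographic hash (SHA-256). Note this is a simplification: Apple’s actual system uses NeuralHash, a perceptual hash designed to survive resizing and recompression, combined with cryptographic techniques so matches aren’t revealed on-device. The database contents below are placeholder values, not real data.

```python
import hashlib

# Placeholder "known hash" database for illustration only.
# In the real system, this would be hashes supplied by NCMEC and similar bodies.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest that uniquely represents these exact image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known(image_bytes: bytes) -> bool:
    """True if the image's hash appears in the known-hash database."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_known(b"example-known-image-bytes"))  # identical bytes: match
print(matches_known(b"some-other-image-bytes"))     # different bytes: no match
```

The key property is that the database stores only hashes, never the images themselves, and a lookup reveals nothing about photos that don’t match.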

Clearing up confusion

Because Apple is doing that checking on the device, some have called it a kind of backdoor. But Apple has now released an FAQ as it tries to explain why it’s doing this scanning, and to clear up some confusion.

Perhaps the main source of that confusion stems from the fact that Apple announced two separate technologies at once. Alongside the CSAM scanning, Apple announced that it was going to update iOS with a feature that detects when a nude image is about to be sent from or to a child’s iPhone. This all happens on the phone and Apple never sees the photo, the company explained. The feature will only work on phones that have a child account set up in Family Sharing. “Apple never gains access to communications as a result of this feature in Messages,” Apple wrote, clarifying that the change does not break end-to-end encryption.

“When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.”

How to stop Apple scanning your photos

Because the CSAM scanning is only happening on iCloud photos, there’s one very obvious way to prevent Apple checking your images: Turn off cloud storage for your photos.

This is very simple. Go to Settings, scroll down to Photos and flick the switch next to iCloud Photos. It’s turned on by default, so you’ll need to switch it off yourself if you don’t want your photos uploaded.

Turning this off, of course, means your photos will be limited to your iPhone’s local storage, which may be a squeeze unless you have one of the higher-capacity models.

Apple says it won’t cave to government demands

Apple also sought to allay fears about governments forcing it to detect things that aren’t child sexual abuse images. In particular, concerns have been raised about the potential for China to pressure the tech giant into looking for photo evidence of dissent.

“Apple will refuse any such demands,” the tech giant said. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”


