WhatsApp head says Apple’s child safety update is a ‘surveillance system’


One day after Apple confirmed plans for new software that will allow it to detect images of child abuse in users’ iCloud Photos libraries, Facebook’s head of WhatsApp says he is “concerned” by the plans.

In a thread on Twitter, Will Cathcart called it an “Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.” He also raised questions about how such a system may be exploited in China or other countries, or abused by spyware companies.

A spokesperson for Apple disputed Cathcart’s characterization of the software, noting that users can choose to disable iCloud Photos. Apple has also said that the system is only trained on a database of “known” images provided by the National Center for Missing and Exploited Children (NCMEC) and other organizations, and that it wouldn’t be possible to make it work in a regionally-specific way since it’s baked into iOS.

It’s not surprising that Facebook would take issue with Apple’s plans. Apple has spent years bashing Facebook over its record on privacy, even as the social network has embraced end-to-end encryption. More recently, the companies have clashed over privacy updates that have hindered Facebook’s ability to track its users, changes the company has said will hurt its advertising revenue.
