
WhatsApp says Apple’s Child Safety tools are a dangerous ‘surveillance system’


Facebook is continuing its war of words with Apple, with the head of the company’s WhatsApp chat app taking aim at Apple’s newly announced Child Safety features.

In a lengthy thread on Twitter, WhatsApp’s Will Cathcart said he was “concerned” about the approach, which will include scanning iPhone users’ photos to check for child sexual abuse material (CSAM) before they are uploaded to iCloud.

Cathcart said the new feature amounted to a “surveillance system” and hit out at software that can “scan all the private photos on your phone.” He claimed the system could eventually become a back door for governments to spy on citizens, something Apple has vehemently opposed in the past.

The WhatsApp executive said: “Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy.”

He went on to say: “This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.”

In an explainer on Friday, Apple said it had built tech that can scan photos earmarked for iCloud uploads on the device, in a manner that protects user privacy.

The firm said: “Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
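Apple's description above can be pictured, in very loose terms, as hashing a photo on the device and bundling the match result into a voucher. The sketch below is a deliberately simplified illustration, not Apple's actual system: the real design uses a perceptual hash (NeuralHash) and private set intersection so the device never learns the match result, whereas this toy version uses a plain SHA-256 set lookup and an unencrypted voucher. All names here (`KNOWN_CSAM_HASHES`, `make_safety_voucher`) are hypothetical.

```python
import hashlib

# Hypothetical placeholder database of known hashes. In Apple's design
# these are perceptual (NeuralHash) digests supplied by child-safety
# organizations, blinded so the device cannot read them directly.
KNOWN_CSAM_HASHES = {hashlib.sha256(b"known-bad-image").hexdigest()}

def image_hash(image_bytes: bytes) -> str:
    # Simplification: SHA-256 only matches byte-identical files.
    # Apple's NeuralHash matches visually similar images instead.
    return hashlib.sha256(image_bytes).hexdigest()

def make_safety_voucher(image_bytes: bytes) -> dict:
    # On-device step: compute the hash and bundle the match result
    # into a "voucher" uploaded alongside the photo. In the real
    # system this result is cryptographically hidden from both the
    # device and Apple until a threshold of matches is crossed.
    digest = image_hash(image_bytes)
    return {"hash": digest, "matched": digest in KNOWN_CSAM_HASHES}
```

The key property the toy version lacks is exactly the one under debate: private set intersection means neither side learns whether an individual photo matched, which is why Apple argues the design preserves privacy and critics like Cathcart argue the scanning capability itself is the problem.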

The features also include new image recognition tools in iMessage and guidance within Siri and Search pertaining to CSAM.

While the features may help identify offending and illegal material and bring perpetrators and abusers to justice, there is clearly widespread concern over the approach and its potential for collateral damage. Apple has long held the high ground over companies like Facebook when it comes to user privacy, but it may be at risk of ceding some of that ground with the new Child Safety tools.

Cathcart added: “There are so many problems with this approach, and it’s troubling to see them act without engaging experts that have long documented their technical and broader concerns with this.”

The entire thread is certainly worth a read. Cathcart defended WhatsApp’s approach, saying it was able to report a worrying 400,000 cases to the authorities without breaking encryption.



