WhatsApp says iPhone scanning for indecent images is risky

Image Credit: Photo by Rachit Tank on Unsplash


Summary

An Apple effort to use iPhones to detect child sexual abuse imagery risks paving the way for wide-scale surveillance, according to rival WhatsApp.

“I think this is the wrong approach and a setback for people’s privacy all over the world,” Will Cathcart, the head of the Facebook-owned WhatsApp, tweeted on Friday. 

 

For years now, companies including Facebook have been using algorithms to scan, detect, and remove child porn from social media, video sites, and cloud storage platforms. However, Apple said this week that it would use “on-device” processing on the iPhone itself to detect and flag child sexual abuse material (CSAM) as the files are uploaded to an iCloud account.

The on-device processing prompted Cathcart to speak out against Apple’s upcoming system, which arrives in iOS 15.
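To make the idea of on-device matching concrete, the sketch below shows roughly what client-side checking of uploads could look like. It is a minimal illustration in Swift, assuming hypothetical helpers such as `shouldFlagBeforeUpload` and an exact SHA-256 comparison against a local fingerprint set; it is not Apple's implementation, whose actual matching details the source does not describe.

```swift
import Foundation
import CryptoKit

// NOTE: a hypothetical, simplified sketch, not Apple's system or API.
// It fingerprints a file with an exact SHA-256 hash and checks it against
// a local set of known fingerprints before an upload would proceed.
// A real on-device matcher would need to use perceptual hashes so that
// resized or re-encoded copies still match; that part is not shown here.

func fingerprint(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

func shouldFlagBeforeUpload(_ fileURL: URL, knownFingerprints: Set<String>) -> Bool {
    guard let hash = try? fingerprint(of: fileURL) else { return false }
    return knownFingerprints.contains(hash)
}

// Usage: the fingerprint set here is an empty placeholder; a real system
// would ship a vetted database that the device consults locally.
let known: Set<String> = []
let photo = URL(fileURLWithPath: "/tmp/example.jpg")
print(shouldFlagBeforeUpload(photo, knownFingerprints: known))
```

The point of the sketch is only that the check happens on the device, before anything reaches the cloud, which is precisely the design choice Cathcart objects to.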

  • Source: PC Mag
  • August 7, 2021
