Ahead of a child safety group's campaign, Apple details why it's dropping iCloud CSAM scanning, saying it could be a "slippery slope of unintended consequences" (Lily Hay Newman/Wired)
Lily Hay Newman / Wired:

Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting.