Apple to Unveil Feature to Detect Child Porn

08.13.21
Apple to Unveil Feature to Detect Child Porn (Photo: Chuong Le [LeSy])

Apple plans to introduce a variety of child safety features to curb child sexual abuse across Messages, Photos and Siri with the release of iOS 15 and iPadOS later this year.

Apple’s new system will use on-device technology to identify known illegal images as they are uploaded to iCloud, without Apple directly looking into users’ photos.

The company plans to implement new software tools in iOS and iPadOS that can detect when someone uploads content to iCloud showing children involved in sexually explicit acts. Apple will also use the technology to notify the National Center for Missing & Exploited Children, which works with law enforcement agencies across the US.
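At a high level, the detection step amounts to matching an uploaded image's fingerprint against a database of fingerprints for known abuse imagery, rather than inspecting photo content directly. The Swift sketch below is illustrative only and is not Apple's implementation: it uses a plain SHA-256 digest and a hypothetical `knownHashes` set to show the matching idea, whereas Apple's announced system relies on on-device perceptual hashing and cryptographic safeguards that keep non-matching photos private.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: flag an image if its fingerprint appears in a
// database of known fingerprints. A cryptographic hash stands in for the
// perceptual hashing a real system would use; this has no detection ability.

/// Hypothetical database of fingerprints for known illegal images.
/// (This sample entry is simply the SHA-256 of empty input.)
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Compute a hex fingerprint of raw image bytes.
func fingerprint(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Returns true if the image matches an entry in the known-hash database.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}

// Example: empty data hashes to the sample entry above, so it "matches".
print(matchesKnownImage(Data()))  // true
```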

Users can appeal a suspension if they believe their account has been mistakenly flagged.    

The Messages app will also include a new parental control option that obscures sexually explicit pictures for users under 18 and sends parents an alert if a child 12 or under views or sends these pictures.

Siri, as well as the built-in search feature found in iOS and macOS, will direct users to child safety resources online if a user searches for topics related to child sexual abuse. 
