Apple keeps silent about future of CSAM

It’s been a year since Apple announced its CSAM (Child Sexual Abuse Material) detection feature for iCloud Photos, but so far the company has said nothing about where the plan stands or whether new measures will follow.

Currently, Apple’s child-safety features blur sexually explicit photos in Messages and offer child-exploitation resources through Siri. Apple had said CSAM detection would ship in updates to iOS 15 and iPadOS 15 by the end of 2021. However, it was postponed due to ‘feedback from researchers, advocacy groups and customers’.

In September last year, Apple updated its Child Safety page, saying that although it still intended to help protect children from predators and limit the spread of Child Sexual Abuse Material, it was delaying the feature to collect feedback and make improvements. The following December, Apple removed that notice. A spokesperson said the company would continue to explore its options, but Apple has made no public comments since.

The CSAM plan was criticized by policy groups, politicians, the EFF, security researchers and academics, among others.