Apple’s new Child Sexual Abuse Material (CSAM) detection feature, which is set to release in the near future, has drawn the attention of whistleblower Edward Snowden. In his latest editorial on Substack, Snowden calls the company’s strategy to tackle child abuse material a ‘tragedy’.
Snowden argues that the new CSAM feature on Apple devices will ‘permanently redefine what belongs to you, and what belongs to them’. According to Apple, the feature will scan images in people’s iCloud accounts for child abuse material. It has come under fire from many in the industry because it goes against Apple’s previous privacy promises.
A breach of promised privacy?
“Once the precedent has been set that it is fit and proper for even a ‘pro-privacy’ company like Apple to make products that betray their users and owners, Apple itself will lose all control over how that precedent is applied,” writes Snowden in his newsletter on Substack.
Apple will release the CSAM feature on its devices when the latest versions of its operating systems arrive later this year. The company is holding an event in September to unveil the next-generation iPhone series, and it is expected to release the final versions of iOS 15 and iPadOS 15 soon after. The latest version of macOS, named ‘Monterey’, is expected to arrive next month after a dedicated Mac hardware event.
“There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple’s all-too-flexible company policy, something governments understand all too well,” adds Snowden.
Apple’s SVP of Software Engineering, Craig Federighi, popularly known as HairForceOne because of his amazing hair, discussed the feature with WSJ’s Joanna Stern in a video. Many have asked the iPhone maker to reconsider the implementation of its CSAM feature.