Apple announced that it is adding a new feature called Apple CSAM detection to iCloud. Since the announcement, many people have been criticizing and opposing the decision. In this article, we will look at what Apple CSAM detection is and why so many people are concerned. So, let’s get into the article.
What is CSAM?
Child Sexual Abuse Material (CSAM) has varying legal definitions in different countries. Broadly, CSAM refers to images or videos depicting a child engaged in, or represented as engaging in, sexually explicit conduct.
What is Apple CSAM detection?
Apple CSAM detection is a feature designed to detect Child Sexual Abuse Material (CSAM) in iCloud Photos. It will launch first in the United States. Apple announced on August 6, 2021, that any expansion outside the United States will be decided on a country-by-country basis, according to local laws and regulations.
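At its core, the system works by fingerprinting each photo and comparing that fingerprint against a database of hashes of already-known CSAM supplied by child-safety organizations; only matches against known material are flagged, not newly analyzed content. The sketch below illustrates that matching idea in Python. It is a heavily simplified illustration, not Apple's implementation: Apple uses a perceptual hash (NeuralHash) that tolerates resizing and re-encoding, while this sketch substitutes a plain SHA-256 of the bytes, which only matches exact copies. The example database and image bytes are invented for illustration.

```python
import hashlib

# Hypothetical stand-in for the database of known-material fingerprints.
# In Apple's system these hashes come from child-safety organizations;
# here we just hash some made-up example bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint of the image. SHA-256 is a stand-in for a
    perceptual hash like NeuralHash, which would also match visually
    similar (resized, re-encoded) copies rather than only exact bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Check the image's fingerprint against the known-hash database."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_known_material(b"example-known-image-bytes"))  # True
print(matches_known_material(b"some-unrelated-photo"))       # False
```

The key design point this illustrates is that the system never "looks at" photo content directly; it only tests fingerprint membership in a fixed database, which is also why critics worry about what happens if that database is expanded to cover other kinds of content.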
Why is everyone concerned about Apple CSAM detection?
As soon as Apple announced the CSAM detection feature, people started raising privacy concerns. The most prevalent worry has been about what may happen if other countries try to use this system for other purposes, even though the functionality will initially be available exclusively in the United States. Concerns have been raised by a number of noteworthy sources, including Edward Snowden and the Electronic Frontier Foundation (EFF).
"All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change." ~ EFF
In response to these concerns, Apple stated that the feature is important for safeguarding children and that it was designed with user privacy in mind.