Monday, 23 December 2024

About Apple’s New CSAM Detection System

Apple has developed a system called CSAM Detection, which searches users’ devices for “child sexual abuse material,” also known as CSAM.

Although “child pornography” is synonymous with CSAM, the National Center for Missing and Exploited Children (NCMEC), which helps find and rescue missing and exploited children in the United States, considers “CSAM” the more appropriate term. NCMEC provides Apple and other technology firms with information on known CSAM images.

CSAM Detection rollout timeline

CSAM Detection will be part of the iOS 15 and iPadOS 15 mobile operating systems, which will become available to users of all current iPhones and iPads (iPhone 6s, fifth-generation iPad, and later) this autumn.

How does it work?

CSAM Detection works by scanning photos on a device to determine whether they match photos in NCMEC’s or other similar organizations’ databases. The detection method uses NeuralHash technology, which in essence creates digital identifiers, or hashes, for photos based on their contents. If a hash matches one in the database of known child-exploitation images, the image and its hash are uploaded to Apple’s servers, where Apple performs another check before officially registering the image.
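NeuralHash itself is proprietary, but the general idea of a perceptual hash can be illustrated with a much simpler classical analogue. The Python sketch below uses a toy “average hash” as a stand-in: the function names, the 8x8 thumbnail, and the distance threshold are all invented for illustration and are not Apple’s actual algorithm.

```python
# Toy "perceptual hash" in the spirit of NeuralHash: images with similar
# content should yield identical or near-identical hashes. NeuralHash is a
# neural-network-based hash; this classical average-hash is an illustrative
# stand-in only, not Apple's algorithm.

def average_hash(pixels: list[list[int]]) -> int:
    """Compute a 1-bit-per-pixel hash: 1 if a pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance means visually similar images."""
    return bin(a ^ b).count("1")

# An 8x8 grayscale thumbnail (values 0-255); real pipelines downscale the
# original photo to a small fixed size before hashing.
image = [[(x * y) % 256 for x in range(8)] for y in range(8)]
# The same image, uniformly brightened: content is unchanged.
brightened = [[min(255, (x * y) % 256 + 3) for x in range(8)] for y in range(8)]

h1 = average_hash(image)
h2 = average_hash(brightened)

known_database = {h1}  # stands in for a database of known-image hashes

print("distance:", hamming_distance(h1, h2))
print("match:", h2 in known_database or hamming_distance(h1, h2) <= 4)
```

The key property, which this toy shares with real perceptual hashes, is that small edits to an image flip few or no bits of the hash, so near-duplicates can still be recognized by their Hamming distance; a cryptographic hash such as SHA-256 would change completely after any edit.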

Another component of the system, a cryptographic technique called private set intersection, encrypts the results of the CSAM Detection scan such that Apple can decrypt them only if a series of criteria is met.
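Apple has not published every detail of its protocol, but a classic way to build private set intersection is with commutative blinding (Diffie-Hellman-style PSI). The Python sketch below is a simplified illustration under that assumption: the group parameters, the item names, and the fact that the device learns the matches are all simplifications. In Apple’s actual design, additional machinery (safety vouchers and threshold secret sharing) keeps results unreadable until enough matches accumulate.

```python
import hashlib
import secrets

# Toy Diffie-Hellman-based private set intersection (PSI) sketch.
# All names and parameters are illustrative assumptions; Apple's actual
# protocol is more elaborate and uses elliptic curves, not this group.

P = 2**127 - 1  # a Mersenne prime defining a multiplicative group mod P

def hash_to_group(item: bytes) -> int:
    """Map an item (e.g., an image hash) into the group mod P."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

def blind(element: int, key: int) -> int:
    """Exponentiation blinds the element; (h^a)^b == (h^b)^a mod P."""
    return pow(element, key, P)

# Device side: perceptual hashes of the user's photos (placeholder values).
device_items = [b"photo-hash-1", b"photo-hash-2", b"photo-hash-3"]
device_key = secrets.randbelow(P - 3) + 2

# Server side: hashes from the known-image database (placeholder values).
server_items = [b"photo-hash-2", b"known-bad-hash"]
server_key = secrets.randbelow(P - 3) + 2

# 1. The device blinds its items and sends them to the server.
device_blinded = [blind(hash_to_group(x), device_key) for x in device_items]

# 2. The server re-blinds the device's items and blinds its own set.
double_blinded = [blind(x, server_key) for x in device_blinded]
server_blinded = [blind(hash_to_group(y), server_key) for y in server_items]

# 3. The device finishes blinding the server's set; equal items now have
#    equal double-blinded values, so the intersection can be computed.
finished = {blind(y, device_key) for y in server_blinded}
matches = [i for i, x in enumerate(double_blinded) if x in finished]

print("Matching items:", [device_items[i] for i in matches])
# -> only b"photo-hash-2" matches; neither side sees the other's
#    non-matching items in the clear.
```

The point of the blinding is that matching happens on double-encrypted values: the server never sees which of the user’s photos it is comparing, and non-matching entries on either side remain opaque to the other party.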

The Challenges

Criticism of Apple’s move falls into two categories: questioning the company’s approach itself and scrutinizing the protocol for vulnerabilities. The central objection is that a device whose data is completely encrypted (as Apple asserts) nevertheless begins reporting to outsiders about that content.

Ultimately, that criticism is more political than technological. The problem lies in the absence of a social contract that balances security and privacy. All of us, from bureaucrats, device makers, and software developers to human-rights activists and rank-and-file users, are trying to define that balance now.

Direct Publication Source: https://www.kaspersky.com/blog/what-is-apple-csam-detection/41502/
