
Apple Finally Backing Down on Invasive CSAM Scanning Tools… For Now

NeuralHash can’t tell a photo of a dog and a computer-generated gray image apart, leading to false positives.

Apple has postponed the release of their invasive Child Sexual Abuse Material (CSAM) scanning tools until “later this year in updates.” The company likely realized that users were pulling out of iCloud services, including paid storage, backups, and iCloud Photos, and that many wouldn’t upgrade to iOS 15 at all in order to avoid the invasive software.

For a quick refresher, Apple’s CSAM scanning tools scan for photos that could be CSAM on your device itself. The technology has security flaws that mean it could be abused by a government, and beyond that, no one wants a digital spy on their own device. CSAM hash scanning, which is how these tools compare images without downloading actual CSAM images and videos, uses a “fuzzy” or approximate match: multiple, visually different images can share the same hash. That means a false positive on your device could have the authorities at your door. Considering how poorly police deal with technology, and how violently they treat suspected child abusers, the results could be incredibly dangerous. Apple claims the odds of incorrectly flagging an account are less than one in a trillion per year, but with over a billion iOS users, each with thousands of photos, year after year, false flags are bound to happen.
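To see why fuzzy matching invites collisions, here’s a minimal sketch of a simple perceptual hash (an “average hash”). This is not Apple’s actual NeuralHash, and the file names and match threshold below are hypothetical, chosen only for illustration: two visually different images can still land within the matching tolerance.

```python
# Minimal perceptual-hash sketch (average hash), for illustration only.
# This is NOT Apple's NeuralHash; it just shows why "fuzzy" matching
# can declare two different images a match.
from PIL import Image  # requires Pillow


def average_hash(path: str, hash_size: int = 8) -> int:
    """Shrink to hash_size x hash_size, grayscale, then set one bit per
    pixel depending on whether it's brighter than the image's mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two hashes."""
    return bin(a ^ b).count("1")


if __name__ == "__main__":
    # Hypothetical file names, standing in for the dog photo and the
    # computer-generated gray image from the caption above.
    dog = average_hash("dog_photo.jpg")
    generated = average_hash("generated_gray.png")

    # A "match" only requires the hashes to be close, not identical.
    # That tolerance is exactly what makes false positives possible.
    if hamming_distance(dog, generated) <= 5:
        print("Flagged as a match (potential false positive)")
    else:
        print("Not flagged")
```

The tolerance is the whole point of perceptual hashing: it lets resized or re-encoded copies of the same image match, but it also means unrelated images can be flagged together.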

Worst of all, this could all be avoided by scanning only the photos that pass through Apple’s servers, as the company already does.

Apple has decided to postpone the rollout and may make changes, but the fight to win back privacy from Apple is not yet done.

Apple’s Evolving Efforts

From Apple’s PDF explaining their technology.

Apple’s going back to the drawing board, but they haven’t committed to abandoning their attempts to comb through users’ devices. Apple’s first reaction to the nearly universal backlash was to blame those who were concerned. Later, they released more information about the CSAM scanning tool, thinking it would reassure users. It didn’t. People didn’t want a spying device on their phones, even for a good cause.

Now Apple’s trying a different approach: delay.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

– Apple’s statement

Here’s hoping Apple’s delay goes the way of AirPower: delayed for years, eventually abandoned, never released.

Security experts, the Electronic Frontier Foundation (EFF), and even some Apple employees spoke out about the danger of Apple’s on-device scanning tool. Finally, Apple listened. Let’s hope they don’t come back with anything like this again.


