Alarming: Apple Reveals New Plan To Upload Software To iPhones That Scans Users’ Photos

OPINION | This article contains the author's opinion.

Privacy watchdog groups are sounding the alarm on the latest move by Apple.

Apple revealed that the company will be uploading software to users’ iPhones that scans for images of child sex abuse.

However, watchdog groups warn that this is opening “Pandora’s box.”

This, critics allege, creates a “backdoor to users’ private lives,” and governments or private companies could abuse that power to violate users’ privacy.

Daily Wire explains:

“Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices,” the Financial Times reported. “The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.”

The report noted that the system will be called “neuralMatch” and will initially be rolled out only in the U.S., with Apple adding in a blog post that the software will “evolve and expand over time.” The software is expected to be included in iOS 15, which is set to be released next month.

The company claimed that the software provides “significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.” (CSAM refers to child sexual abuse material.)
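To picture what that claim means in practice, here is a rough, hypothetical sketch of threshold-based matching against a database of known images. It is not Apple’s actual neuralMatch design: the function names, the use of an ordinary cryptographic hash as a stand-in for a perceptual fingerprint, and the threshold value are all illustrative assumptions.

```python
import hashlib

# Hypothetical illustration only -- NOT Apple's neuralMatch implementation.
# The reported idea: compare derived image fingerprints against a database of
# known CSAM fingerprints and flag an account only after enough matches.

KNOWN_FINGERPRINTS = {
    # Placeholder values standing in for a vetted database of known images.
    "3f786850e387550fdab836ed7e6dc881de23001b",
}
MATCH_THRESHOLD = 3  # Assumed value for illustration; the real threshold is not stated here.


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint. A real system would use a perceptual hash,
    not a cryptographic one, so near-duplicate images still match."""
    return hashlib.sha1(image_bytes).hexdigest()


def account_flagged(photo_library: list[bytes]) -> bool:
    """Return True only if enough photos match the known database."""
    matches = sum(
        1 for photo in photo_library if fingerprint(photo) in KNOWN_FINGERPRINTS
    )
    return matches >= MATCH_THRESHOLD


# A library with no matching photos is never flagged under this scheme.
print(account_flagged([b"vacation photo", b"family dinner"]))  # False
```

The point of the threshold in this sketch is the same one Apple’s statement gestures at: a single match, or none, reveals nothing about the rest of a user’s library.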

Despite Apple’s claims, however, academics and privacy watchdogs are deeply concerned about what the move signals long term.