Apple Devices to Start Scanning Images for Child Abuse

Virtually everyone agrees on the need to put a stop to child sex abuse. It is something society simply cannot, and must not, tolerate. Yet one Big Tech giant just might be taking the fight to dangerous levels.

On August 5, Apple revealed it will begin using new software to scan iPhones for images containing child abuse later in the year.

The software will scan the messages of users under the age of 18 and blur potentially harmful images. A companion feature will alert parents whenever an image triggers the system.

Another feature will scan the personal photos stored on a user’s phone and in iCloud. If questionable material shows up in a scan, it will go for manual review. Once reviewers confirm that an image shows illegal content, the company will disable the person’s account and report it to law enforcement.

For a company with such a strong history of treating privacy as a basic human right, this is incredibly concerning. Apple says that only images its automated system flags as depicting child sex abuse will be reviewed by Apple employees.

However, Apple’s assurances do little to soothe the fears of researchers and privacy watchdogs. The idea of giving up personal liberties for the greater good is reaching a point that feels frighteningly like something out of a dystopian novel. Where do we draw the line?

Copyright 2021,