By Chris King • 05 September 2021 • 19:46
APPLE has announced that it intends to delay the release of its controversial new tools for detecting child sexual abuse images on iPhones and iPads.
Apple announced last Friday, September 3, that it would delay the launch of the controversial new tools, which some experts have criticised for undermining the privacy of its devices and services.
The Silicon Valley giant had announced last month that iPhones and iPads, at least in the United States, would soon begin to detect images containing child sexual abuse and report them as they are uploaded to its online iCloud storage system.
Digital rights associations protested against the system, arguing that the adjustments to Apple’s operating systems created a potential “back door” in devices that could be exploited by governments or other organisations.
Apple had previously become a standard-bearer for its customers’ privacy in the case of the San Bernardino massacre in California, when it refused to give the FBI the keys to the main suspect’s iPhone. It eventually collaborated under the threat of being charged with a federal crime, and only after the FBI had turned to an Israeli company.
In a statement, Apple said, “Based on feedback from customers, advocacy groups, researchers, and others, we decided to take more time in the coming months to gather information, and make improvements before launching these important child safety features”.
According to Apple’s original description of the system, the new technology would enable the software that powers Apple’s mobile devices to compare photos stored on a user’s phone against a database of child sexual abuse images provided by child safety organisations, and to flag matching images as they are uploaded to Apple’s online iCloud storage.
The system, should it come into use, would be “powered by cryptographic technology” to determine “if there is a match without revealing the result”, unless the image is found to contain child sexual abuse.
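The general idea of checking an image against a database of known abusive material can be illustrated with a simplified sketch. Note the assumptions: Apple’s actual system reportedly uses perceptual “NeuralHash” values and a private set intersection protocol so that non-matches are never revealed; the snippet below substitutes plain SHA-256 digests and an in-memory set purely to show the matching concept, and all names and sample byte strings are hypothetical.

```python
import hashlib

# Hypothetical database of digests of known images (illustrative only;
# the real system uses perceptual hashes supplied by child safety
# organisations, not plain cryptographic hashes of exact file bytes).
KNOWN_DIGESTS = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the known-image set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_DIGESTS

print(matches_known_image(b"known-image-bytes"))    # True: digest is in the set
print(matches_known_image(b"holiday-photo-bytes"))  # False: no match
```

One limitation this sketch makes visible: an exact cryptographic hash changes completely if a single pixel changes, which is why real systems in this space rely on perceptual hashing that tolerates resizing and re-encoding.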
Other companies, such as Microsoft, Facebook, and Google, already operate software that helps police forces around the world detect images and words linked to child abuse, a use the European Commission has authorised, as reported by elperiodico.com.
Originally from Wales, Chris spent years on the Costa del Sol before moving to the Algarve where he is a web reporter for The Euro Weekly News covering international and Spanish national news.