Explained | How Apple tool will scan for child abuse photos on devices

Rohit KVN
Last Updated : 10 August 2021, 13:03 IST

Last week, Apple revealed that it plans to launch a Child Safety tool that will scan iPhones and other devices for Child Sexual Abuse Material (CSAM) and flag it to the relevant authority, such as the National Center for Missing and Exploited Children (NCMEC), so that action can be taken against the alleged offender.

Initially, it is expected to start in the US. While some people and organizations have appreciated the move, many privacy activists and even big-tech executives, such as WhatsApp head Will Cathcart, have raised concerns that it will lead to a breach of personal privacy.

Their primary concern is that if the tool falls into the hands of unscrupulous hackers, or even a government agency, it could be turned into a backdoor into Apple devices for espionage.

Now, Apple has released a document clarifying how the tool works on-device while protecting the owner's privacy.

Here's how Apple's child safety tool works:
Apple says the new tool has been developed with user privacy in mind: the matching happens entirely on-device, and it does not scan every photo sitting in the device's storage; only images being uploaded to iCloud Photos are checked. The tool also keeps the end-to-end encryption of the Apple Messages app intact at all times.

Apple allows parents to set up an iPhone, iPad or iPod Touch for a child with a unique Apple ID and link it to a parent's or guardian's account via the Family Sharing feature.

It should be noted that a parent or guardian account must opt in to enable communication safety in the Messages app, and parental notifications can be turned on only for child accounts aged 12 and younger.

This helps Apple identify that the device is used by a child, and this is where the tool comes into action. When a sexually explicit image is sent or received in the Apple Messages app on a child's account, the image is analysed on the device and the child is warned, with an optional notification to the parents.

Separately, when a user has turned on iCloud Photos, each image being uploaded to the cloud is first matched on the device against the hashes of known CSAM images.
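Conceptually, that check is a set-membership test: compute the image's fingerprint and see whether it appears in an on-device table of known-CSAM fingerprints. Here is a minimal sketch of that idea; the fingerprint() function (an ordinary SHA-256 here, standing in for Apple's NeuralHash), the dummy table, and the function names are all assumptions for illustration, not Apple's code.

```python
# Conceptual sketch only -- not Apple's implementation. fingerprint() and the
# table below stand in for NeuralHash and the hash database Apple says is
# shipped to the device.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash (illustration only)."""
    return hashlib.sha256(image_bytes).hexdigest()

# Fingerprints of images already classified as CSAM by NCMEC; seeded here with
# dummy byte strings purely so the example runs.
KNOWN_CSAM_FINGERPRINTS = {
    fingerprint(b"known-flagged-image-1"),
    fingerprint(b"known-flagged-image-2"),
}

def check_before_icloud_upload(image_bytes: bytes) -> bool:
    """True if the image matches a known fingerprint. On a real device the raw
    result is never exposed like this; it is wrapped in an encrypted
    'safety voucher' (explained below)."""
    return fingerprint(image_bytes) in KNOWN_CSAM_FINGERPRINTS

print(check_before_icloud_upload(b"holiday-photo"))          # False
print(check_before_icloud_upload(b"known-flagged-image-1"))  # True
```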

Apple CSAM tool Workflow. Credit: Apple

What is NeuralHash technology?
The NeuralHash technology is central to the privacy assurances Apple is making for the new tool. It can only detect images that have already been flagged and classified as child sexual abuse material by NCMEC; Apple cannot detect new or previously unseen content. A mathematical algorithm computes a digital fingerprint of each image in the form of a unique string of letters and numbers, and this fingerprint is compared against those of the images in NCMEC's database. Apple says there is practically no chance of a person's own photos being flagged.
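NeuralHash is a perceptual hash: unlike a cryptographic hash, it is designed so that visually identical copies of an image (resized, recompressed, lightly edited) map to the same fingerprint. The toy "average hash" below is not NeuralHash, but it shows the underlying idea of turning an image into a short bit string that survives small pixel-level changes.

```python
# Toy "average hash" to illustrate perceptual fingerprinting. NeuralHash itself
# uses a neural network; this is only a conceptual analogue, not Apple's method.

def average_hash(pixels: list[list[int]]) -> str:
    """Fingerprint a small grayscale image (values 0-255) as a bit string:
    each bit records whether a pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

original = [[200, 190, 40],
            [210,  60, 30],
            [220,  50, 20]]

# A slightly recompressed copy: pixel values shift a little, the bits do not.
recompressed = [[198, 192, 43],
                [207,  58, 33],
                [222,  47, 25]]

print(average_hash(original))                                # 110100100
print(average_hash(original) == average_hash(recompressed))  # True
```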

For instance, if you (an adult) have shared an intimate photo with your partner, and they have stored it in a photo library linked to iCloud, it will not be flagged as criminal activity, nor will it be sent for manual review by a human operator.

Credit: Apple

"This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image," says the Apple document.

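At a very high level, a safety voucher is an encrypted bundle: the match result plus some image-derived data, uploaded alongside the photo in a form the server cannot read on its own. The sketch below is a deliberately simplified stand-in; Apple's real scheme uses private set intersection so that even the device never learns the match result, and the keys are tied to the threshold mechanism described next. Fernet (from the third-party cryptography package) is used here purely as a placeholder cipher, and all names are hypothetical.

```python
# Deliberately simplified sketch of a "safety voucher" -- not Apple's design.
# Fernet stands in for the private-set-intersection machinery.
from dataclasses import dataclass
from cryptography.fernet import Fernet

@dataclass
class SafetyVoucher:
    encrypted_payload: bytes  # match result + image-derived data, opaque to the server

def make_voucher(image_bytes: bytes, matched: bool, per_image_key: bytes) -> SafetyVoucher:
    payload = b"matched=" + (b"1" if matched else b"0") + b";derivative=" + image_bytes[:16]
    return SafetyVoucher(Fernet(per_image_key).encrypt(payload))

key = Fernet.generate_key()  # in the real scheme, keys are tied to the match and threshold logic
voucher = make_voucher(b"raw image data", matched=False, per_image_key=key)
# The voucher is uploaded to iCloud Photos along with the image; unless enough
# matching vouchers accumulate (see the threshold below), the payloads stay unreadable.
print(len(voucher.encrypted_payload) > 0)  # True
```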
The company also uses a threshold secret sharing system, which ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content.

This threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
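The "threshold secret sharing" Apple describes can be illustrated with the classic Shamir scheme: a per-account decryption key is split into shares, one share accompanying each matching voucher, and the key (and hence the voucher contents) can be reconstructed only once at least a threshold number of shares has accumulated. The minimal sketch below is not Apple's implementation, and the threshold value is chosen arbitrarily for illustration.

```python
# Minimal Shamir secret sharing over a prime field, to illustrate the threshold
# idea: fewer than `threshold` shares reveal nothing about the key.
import random

PRIME = 2**127 - 1  # a large prime field for a toy secret

def make_shares(secret: int, threshold: int, num_shares: int):
    """Split `secret` into points on a random polynomial of degree threshold-1;
    any `threshold` of them are enough to reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

account_key = 123456789  # stands in for the per-account decryption key
shares = make_shares(account_key, threshold=10, num_shares=25)

print(reconstruct(shares[:10]) == account_key)  # True: threshold reached, key recoverable
print(reconstruct(shares[:9]) == account_key)   # False: below threshold, key stays hidden
```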

If an account crosses the pre-set limit, the flagged images are manually reviewed by a human operator at Apple, and if they are confirmed to be child abuse photos, the person's account ID is flagged and the details are forwarded to NCMEC, which in turn hands over the case to local law enforcement agencies.

If the person believes their account has been flagged by mistake, they can file an appeal to have their Apple account reinstated.

Apple is also set to improve Siri and Search on the iPhone, iPad, and Mac to help users quickly find local agencies and resources for reporting child abuse.

Messages will warn children and their parents when receiving or sending sexually explicit photos.

For instance, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

Apple said all the new features will come to compatible devices through software updates: iOS 15, iPadOS 15, and macOS Monterey, in September.

Get the latest news on new launches, gadget reviews, apps, cybersecurity, and more on personal technology only on DH Tech.

Published 10 August 2021, 13:03 IST
