Apple is to delay the launch of new tools created to detect child sexual abuse material (CSAM).
It says it wants to take more time to “make improvements” after concerns relating to privacy were raised.
The tech company had announced plans to introduce a new system that could detect CSAM if a user tried to upload it to iCloud, and report it to the authorities.
Apple said this process would be done in a secure fashion, and would not regularly scan a user’s camera roll.
However, the plans have drawn criticism from privacy campaigners, some of whom suggest the technology could be hijacked by authoritarian governments to search for other types of imagery – something Apple has stated it would not allow.
But the tech giant has now confirmed it is delaying the rollout following feedback from a number of groups.
In a statement the company said: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
How does the system work?
The system looks for image matches against a database of “hashes” – a type of digital fingerprint – of known CSAM images provided by child safety organisations.
This takes place securely on a device when a user attempts to upload images to their photo library on iCloud.
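The matching step described above can be sketched as a simple lookup against a set of known fingerprints. This is an illustrative sketch only: the names and the database contents are hypothetical, and it uses a cryptographic hash (SHA-256) as a stand-in, whereas Apple’s actual system uses a perceptual hash (“NeuralHash”) so that resized or recompressed copies of an image still match.

```python
import hashlib

# Hypothetical database of fingerprints of known CSAM images, as would be
# provided by child safety organisations (placeholder values here).
known_hashes: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Return a digital fingerprint ("hash") of an image's bytes.

    SHA-256 stands in for the perceptual hash used in practice; a
    cryptographic hash only matches byte-identical files.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Check an image against the database before an iCloud upload."""
    return fingerprint(image_bytes) in known_hashes
```

In the real design this check runs on the device itself, and a single match is not reported: the system only flags an account once a threshold number of matches is reached.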
Another feature designed to protect children is incorporated in the Messages app, which warns children and their parents using linked family accounts when sexually explicit photos are sent or received. Such images are blocked from view and accompanied by on-screen alerts. Additionally, new guidance in Siri and Search will point users to helpful resources when they perform searches related to CSAM.
Apple said the two features are not the same and do not use the same technology, adding that it will “never” gain access to communications as a result of the improvements to Messages.