Apple has revealed details of a new system to find child sexual abuse material (CSAM) on devices in the US.

Before an image is stored in iCloud Photos, the system will check it for matches against known CSAM.

If a match is found, a human reviewer will then manually assess the images and report the user to law enforcement.

Apple said: "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes."

The company confirmed new versions of iOS and iPadOS - which will launch later this year - will have "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy".

The technology compares pictures to a database of known CSAM compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.

From there, the pictures will be turned into "hashes" - numerical codes - which can then be "matched" against photos on an Apple device, even if those photos are edited but visually similar versions.
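To show why this kind of matching can still catch an edited copy, the sketch below assumes a 64-bit perceptual hash and a small bit-difference threshold. The hash values and the threshold are invented for illustration; Apple's own hashing and matching process works differently and is not reproduced here.

```python
# Illustrative sketch only: the 64-bit hash values and the threshold below
# are invented, and Apple's actual perceptual hashing and matching protocol
# are more sophisticated than this.

def hamming_distance(a: int, b: int) -> int:
    """Count how many bits differ between two 64-bit hashes."""
    return bin(a ^ b).count("1")


def matches_known_hash(image_hash: int, known_hashes: list[int],
                       max_distance: int = 4) -> bool:
    """A perceptual hash changes only slightly when an image is edited,
    so a match tolerates a few differing bits instead of requiring
    exact equality."""
    return any(hamming_distance(image_hash, known) <= max_distance
               for known in known_hashes)


# Example: a lightly edited copy differs in two bits but still matches.
known = [0x9C3B_71A2_0F44_D5E8]
edited_copy = known[0] ^ 0b101  # flip two bits to mimic a small edit
print(matches_known_hash(edited_copy, known))  # True
```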