
Attention Apple users: All photos in iCloud will be checked by child abuse detection system



iPhone users’ entire photo libraries will be checked for known child abuse images if they are stored in the online iCloud service, Apple Inc has said.

The disclosure came in a series of media briefings in which Apple is seeking to dispel alarm over its announcement last week that it will scan users’ phones, tablets and computers for millions of illegal pictures.


While Google, Microsoft and other technology platforms check uploaded photos or emailed attachments against a database of identifiers provided by the National Center for Missing and Exploited Children and other clearing houses, security experts faulted Apple’s plan as more invasive.
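The matching these platforms perform can be sketched in a few lines. Note this is a simplified illustration only: real systems compare perceptual hashes (such as Microsoft's PhotoDNA or Apple's NeuralHash), which tolerate resizing and re-encoding, against identifiers supplied by NCMEC; the cryptographic SHA-256 digest and the example database below are stand-ins for illustration.

```python
import hashlib

# Hypothetical database of known-image identifiers. In practice these are
# perceptual hashes provided by NCMEC and other clearing houses; SHA-256
# is used here only as a simplified stand-in.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", standing in for a known image.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known(data: bytes) -> bool:
    """Flag an upload whose identifier appears in the database."""
    return file_hash(data) in KNOWN_HASHES
```

A match does not by itself trigger a report in the real systems; platforms typically require human review (and, in Apple's design, a threshold number of matches) before escalating.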

Some said they expected that governments would seek to force the iPhone maker to expand the system to peer into devices for other material. In a posting to its website on Sunday, Apple said it would fight any such attempts, which can occur in secret courts.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands,” Apple wrote.

“We will continue to refuse them in the future.”

In the briefing on Monday, Apple officials said the company’s system, which will roll out this fall with the release of its iOS 15 operating system, will check existing files on a user’s device if users have those photos synced to the company’s storage servers.

Julie Cordua, chief executive of Thorn, a group that has developed technology to help law enforcement officials detect sex trafficking, said about half of child sexual abuse material is formatted as video.

Apple’s system does not check videos before they are uploaded to the company’s cloud, but the company said it plans to expand its system in unspecified ways in the future.

Apple has come under international pressure over the comparatively low number of abuse-material reports it files relative to other providers. Some European jurisdictions are debating legislation to hold platforms more accountable for the spread of such material.