16 Dec, 2021 09:34

Apple memory-holes pedo-busting iPhone photo-scanning plan

Tech giant Apple has quietly erased references to its plan to scan user files for images of child sexual abuse on all US iPhones, but has not shelved the idea itself. Critics call the proposed technology a major invasion of privacy.

The latest update of Apple’s Child Safety page has erased all mention of the Child Sexual Abuse Material (CSAM) detection scheme, which triggered a backlash after it was first announced in August.

Under the scheme, Apple devices would compute hash sums of private images and compare them against a database of hashes of known child sexual abuse images. A match would allow the company to refer identified possessors of child pornography to the authorities, Apple said.
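In rough outline, hash-based matching of this kind works like the sketch below. This is only an illustration of the general idea: the hash database and function names here are hypothetical, and Apple’s proposal described a perceptual “NeuralHash” computed on-device with cryptographic matching, not plain file digests.

```python
import hashlib

# Hypothetical database of hashes of known abuse images (placeholder value,
# not real data). Apple's actual proposal used a perceptual "NeuralHash"
# and on-device cryptographic matching, not plain file hashes.
KNOWN_HASHES = {
    "0f343b0931126a20f133d67c2b018a3b6e5d4c7a",  # placeholder entry
}

def image_hash(path: str) -> str:
    """Return a hex digest of the file's raw bytes (stand-in for a perceptual hash)."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path: str) -> bool:
    """True if the image's hash matches an entry in the known-hash database."""
    return image_hash(path) in KNOWN_HASHES
```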

The suggestion was criticized by many online privacy advocates, including NSA whistleblower Edward Snowden and the Electronic Frontier Foundation. They said the approach essentially gives Apple broader powers to surveil its customers, and warned that once such scanning is introduced, governments would be in a position to strong-arm US tech giants into monitoring people for other material they deem dangerous or undesirable.

The page update was first spotted by MacRumors.com and happened sometime between last Friday and Monday, judging by archived versions.

Despite the change, Apple still plans to roll out the feature at some point, company spokesperson Shane Bauer told The Verge, adding that Apple’s position on CSAM detection has not changed since September, when it said it was delaying the launch. A detailed explanation of how the system would work, published in August, is still live on Apple’s website, but no release date has been announced.

The page update apparently came with the introduction of two other Apple child-protection features last week. One scans images in the Messages app for possible nudity, blurring explicit photos received by a minor and warning minors, with a request for confirmation, before they send one. The other intervenes when a minor enters a search query on a topic Apple believes to be related to child exploitation. Both were announced in August alongside CSAM detection.
