Reshared from www.engadget.com
Apple has hinted it might not revive its controversial effort to scan for CSAM (child sexual abuse material) photos any time soon. MacRumors notes Apple has removed all mentions of the scanning feature from its Child Safety website. Visit now and you'll only see iOS 15.2's optional nude photo detection in Messages and intervention when people search for child exploitation terms.

It's not certain why Apple has pulled the references. We've asked the company for comment. This doesn't necessarily represent a full retreat from CSAM scanning, but it at least suggests a rollout isn't imminent.

While Apple was already scanning iCloud Photos uploads for hashes of known CSAM, the change would have moved those scans to the devices themselves to ostensibly improve privacy. If iCloud Photos was enabled and enough hashes appeared in a local photo library, Apple would decrypt the relevant "safety vouchers" (included with every image) and manually review the pictures for a potential report to the National Center for Missing and Exploited Children. That, in turn, could get police involved.

The CSAM detection feature drew flak from privacy advocates. Apple stressed the existence of multiple safeguards, such as a high threshold for reviews and its reliance on hashes from multiple child safety organizations rather than governments. However, there were concerns the company might still produce false positives or expand scanning under pressure from authoritarian regimes. Moreover, the only way to prevent on-device scans was to avoid using iCloud Photos altogether: you had to accept Apple's new approach or lose a valuable cloud service.

Apple delayed the rollout indefinitely to "make improvements" following the criticism. However, it's now clear the company isn't in a rush to complete those changes, and doesn't want to set expectations to the contrary.
If local CSAM scanning reappears, it might take a long while (Apple expected a return in the “coming months”) or bear only a partial resemblance to the original system.
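The threshold mechanism described above can be sketched in a few lines. This is an illustration only, not Apple's implementation: the real system uses NeuralHash (a perceptual image hash) together with private set intersection and threshold secret sharing, so individual matches are never visible to Apple. Plain SHA-256, the in-memory hash set, and all function names here are assumptions for the demo; the threshold of 30 reflects the figure Apple publicly cited.

```python
import hashlib

# Assumption: Apple cited a threshold of roughly 30 matches before review.
REVIEW_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: exact SHA-256 of the raw image bytes.
    (NeuralHash would also match visually similar, re-encoded images.)"""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(library: list[bytes], known_hashes: set[str]) -> int:
    """Count how many images in the local library match the known-hash set."""
    return sum(1 for img in library if image_hash(img) in known_hashes)


def should_trigger_review(library: list[bytes], known_hashes: set[str],
                          threshold: int = REVIEW_THRESHOLD) -> bool:
    """Human review (decrypting the per-image "safety vouchers") would only
    occur once the number of matches crosses the threshold."""
    return count_matches(library, known_hashes) >= threshold
```

The threshold is the safeguard Apple emphasized: a single stray match decrypts nothing, since review is gated on the aggregate count.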