Apple scrubs its support pages of all mentions of its controversial CSAM image scanning feature

by Tech News

A hot potato: Apple’s controversial CSAM (child sexual abuse material) scanning feature appears to have been canned. The company has quietly scrubbed its child safety support pages of all mention of the once-upcoming iOS feature. The functionality was already on indefinite hold, so the removal could mean the company has canceled the project entirely. Apple has not commented on the situation.

Apple first announced CSAM scanning in early August, and it immediately drew criticism from privacy advocates. Cupertino engineers designed the system to scan devices for images of child abuse by comparing image hashes against a database of known CSAM. If the system found enough matching hashes, it would escalate the pictures to human review and potentially to law enforcement.
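
In rough terms, the idea amounts to threshold matching against a list of known hashes. The short Swift sketch below illustrates only that threshold logic, with made-up hash strings and an invented threshold value; Apple's actual design relied on perceptual hashing and on-device cryptographic protections that this simplified example does not attempt to reproduce.

// Conceptual sketch only: simplified threshold matching, not Apple's actual system.
// Hash values and the threshold are invented for illustration.

// Hashes of known CSAM (supplied by clearinghouses in the real design).
let knownHashes: Set<String> = ["a1b2c3", "d4e5f6", "0789ab"]

// Hypothetical hashes computed from a user's photo library.
let deviceImageHashes = ["ffeedd", "a1b2c3", "123456", "d4e5f6"]

// Nothing escalates until at least this many matches accumulate.
let reviewThreshold = 3

// Count how many on-device hashes appear in the known-CSAM set.
let matchCount = deviceImageHashes.filter { knownHashes.contains($0) }.count

if matchCount >= reviewThreshold {
    print("Threshold reached (\(matchCount) matches): flagged for human review.")
} else {
    print("Below threshold (\(matchCount) matches): nothing is reported.")
}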

The inherent problems with the system were readily evident. Those found to have CSAM would face prosecution on evidence gathered through a blatant violation of their Fourth Amendment rights. Critics were also concerned that such a system could produce false positives, at least at the machine level: innocent users could have their photos viewed by another person without their permission if the scan returned enough hash matches. There was also unease over the possibility that oppressive governments could order the scanning of dissidents.

We’re winning.

Apple has announced delays to its intended phone scanning tools while it conducts more research. But the company must go further, and drop its plans to put a backdoor into its encryption entirely. https://t.co/d0N1XDnRl3

— EFF (@EFF) September 3, 2021

Apple argued at the time that people were misinterpreting how the scanning would work and promised that it would never cave to governmental demands to misuse the system. In a misguided attempt to address the backlash, Apple mentioned that it had already been running CSAM-detection algorithms on iCloud email for the last three years.

Instead of stemming concerns, the email-scanning admission only stirred the pot further. Pressure grew to the point that Apple indefinitely delayed the rollout of CSAM scanning, stating that it wanted to make improvements based on “feedback from customers, advocacy groups, researchers and others.”

The Electronic Frontier Foundation applauded the postponement but said nothing short of abandoning the project entirely would be enough (above). In the meantime, the support pages continued to carry a full explanation of how the system worked until recently.

On Wednesday, MacRumors noticed that the CSAM content was missing from the website. Cupertino has yet to comment on the removal. Presumably, Apple has either put the project on the back burner until it figures out how to implement it without raising users’ hackles, or has canceled it altogether.
