Apple may have shelved some of its plans to scan devices for CSAM content, but the European Commission has once again put the issue in the spotlight with a push to get messaging services to start monitoring for such material.

CSAM becomes a privacy test

In terms of child protection, that’s a good thing. Child Sexual Abuse Material (CSAM) is a far bigger problem than many people realize, and the victims of this horrific trade end up with ruined lives.

What’s happening, according to Euractiv, is that the European Commission plans to introduce measures requiring messaging services to perform CSAM scans. However, Europe appears to have taken on board some of the arguments privacy advocates raised against Apple’s original proposals, and insists on some restrictions, including:

  • Scanning technology must be “effective”.
  • It must be “sufficiently reliable”.
  • And it should avoid extracting “any other information from the relevant communications than the information strictly necessary for detection.”

Of course, ensuring the “reliability” of any such system is a challenge.

Just what is reliable?

When Apple announced its own approach to CSAM scanning on its platforms, researchers at Imperial College London soon warned that the technology behind the system was easy to fool, calling it “not ready for deployment.”

Apple subsequently retreated from its plans, and later introduced a system to monitor for such content in its Messages app. This no longer involves on-device analysis of people’s photo libraries, as was originally intended. It remains possible that Apple scans photos stored in iCloud, as other image-hosting companies do.

Copyright © 2022 IDG Communications, Inc.
