Apple may have shelved some of its plans for on-device scanning of content for CSAM, but the European Commission has once again put the idea in the spotlight with a push to get messaging services to monitor for such material.
CSAM becomes a privacy test
Child protection, of course, is a good thing. Child sexual abuse material (CSAM) is a far bigger problem than many people realize; the victims of this horrific trade end up with ruined lives.
What’s happening, according to Euractiv, is that the European Commission plans to introduce measures requiring messaging services to scan for CSAM. However, Europe seems to have taken on board some of the arguments privacy advocates raised against Apple’s original proposals, and insists on certain restrictions, including:
- Scanning technology must be “effective”.
- It must be “appropriately reliable”.
- And it should avoid collecting “any other information from relevant communications other than information strictly necessary for detection.”
Of course, ensuring the “reliability” of any such system is a challenge.
Just what is reliable?
When Apple announced its own approach to CSAM scanning on its platforms, researchers at Imperial College London soon warned that the technology behind the system was easy to fool, calling it “not ready for deployment.”
Apple subsequently retreated from those plans and later introduced a system to monitor for such content in its Messages app. This does not yet extend to on-device analysis of people’s photo libraries, as originally intended. It remains possible that the company scans photos stored in iCloud, as other image-hosting companies do.
When it comes to Europe’s proposals, there is hope that the requirement to build “appropriately reliable” systems will eventually carry some burden of proof. While these restrictions will not fully calm the minds of those who fear such technologies will continue to be abused by repressive or authoritarian governments, they at least represent steps toward a shared understanding of what people’s online privacy rights should be.
At the same time, the EC’s proposals seem to threaten the use of end-to-end encryption, a protection Apple continues to champion.
Toward a digital bill of privacy rights
The lack of a clear and coherent set of rights to protect online privacy becomes increasingly critical as the world becomes more connected. At the same time, Europe is also pushing regulations – mandatory sideloading, for example – that could undermine the privacy and security of devices. These two directions look philosophically opposed, but it is possible that as regulators and legislators grapple with the complexity of these issues, they will begin to see some glimmer of light.
I think this is what Apple is working toward, as it seems increasingly important (even the World Economic Forum agrees) to develop an international standard defining digital rights. And the need for such standards is growing.
Europe understands this; it put forward a declaration on digital rights and principles for EU residents in early 2022.
At the time, Margrethe Vestager, executive vice president for a Europe Fit for the Digital Age, said in a statement: “We want safe technologies that work for people and respect our rights and values. Also when we are online. And we want everyone to have the opportunity to take an active part in our increasingly digital societies. This declaration gives us a clear guide to the rights and principles of the online world.”
What should these rights be?
Apple’s leadership has been actively lobbying for a framework around such rights for some time. Ever since Apple CEO Tim Cook’s powerful 2018 speech on digital surveillance, the company has constantly and (mostly) consistently lobbied for privacy protections. Cook’s campaign continues, with Apple working to grant such rights unilaterally while also calling for universality in such protections. Apple advocates the following four pillars:
- Users should have the right to have their personal data minimized.
- Users should have the right to know what data is collected about them.
- Users should have access to this data.
- Users should have the right to data security.
While we are all aware that some business models would be forced to change as a result of any such set of rights, the introduction of some degree of digital certainty would at least help level the playing field in technology.
And the need to maintain a well-thought-out balance between individual rights and collective responsibility seems stronger today than ever before.
Copyright © 2022 IDG Communications, Inc.