The European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) have published an opinion about proposed child safety measures that they claim could lead to “indiscriminate scanning of the content of virtually all types of electronic communications of all users”.
The opinion critiques a proposed EU regulation that would allow legal authorities to force communications providers to scan messages in an effort to detect child abuse.
The EDPB and EDPS, two bodies that regulate data protection and privacy across the EU member states and EU institutions respectively, heavily criticise the proposal, which they describe as “intrusive” and “disproportionate” and which they say risks “undermining the respect for the fundamental rights of European citizens”.
But some argue that the opinion should have gone even further.
“We agree fully with the opinion, of course,” said Kaspar Rosager Ludvigsen, Research Associate at the University of Strathclyde. “But we would go further and ask for the Commission to call it off in this manner entirely, as it is not fit for purpose.”
“We could have gone towards a regulation which sets down requirements for social worker-based systems and preventive measures in member states, but instead we get this technosolutionistic solution.”
What Is in the Proposal?
The EDPB and EDPS opinion concerns a proposal from the European Commission titled “Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse”.
The Commission’s proposals, published in May, aim to tackle online grooming and the sharing of child sexual abuse material (CSAM) by:
- Establishing a new independent EU Centre on Sexual Abuse (EU Centre)
- Requiring communications providers to undertake risk assessments and risk mitigation measures
- Requiring EU member states to enable legal authorities to review risk assessments and issue “detection orders” in certain circumstances
- Setting out safeguards that must be in place when communications providers are attempting to detect child sexual abuse
- Requiring communications providers to report child sexual abuse to the EU Centre
- Empowering national authorities to issue “removal orders” to take down CSAM
- Requiring app stores to prevent children from downloading apps that could expose them to “a high risk of solicitation”
- Providing oversight and redress mechanisms
In practice, this would require certain service providers, on receipt of a “detection order”, to scan the content of their users’ messages and refer those suspected of sending or receiving CSAM to the authorities.
Scanning of Content
Under Article 10(1) of the proposed regulation, providers that receive a detection order from a legal authority must “execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children”.
The proposal would enable providers to acquire technologies designed for this purpose from the proposed EU Centre on Sexual Abuse.
This section of the proposals has campaigners, academics and the EDPB and EDPS particularly worried about privacy infringements.
“The proposed law contains measures which seem to—at a very minimum—allow the widespread scanning of people’s private communications,” says a post by the campaign group European Digital Rights (EDRi).
“Furthermore, the proposal will discourage the use of end-to-end encryption and heavily incentivise providers to take the most intrusive measures possible in order to avoid legal consequences,” the post continues.
Possible techniques for detecting content could include client-side scanning (CSS), which aims to detect the presence of prohibited content on a user’s device rather than intercepting it during transmission.
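In its simplest form, hash-based CSS compares a fingerprint of each outgoing file against a blocklist of fingerprints of known abuse material. The Python sketch below is purely illustrative: the function name and blocklist entry are invented, and deployed systems typically rely on perceptual hashes (such as Microsoft’s PhotoDNA), which tolerate resizing and re-encoding, rather than an exact cryptographic digest like the SHA-256 stand-in used here.

```python
import hashlib

# Hypothetical blocklist of digests supplied by a central authority.
# Real systems use perceptual hashes so that re-encoded copies still
# match; SHA-256 is a simplified stand-in for illustration only.
KNOWN_HASHES = {
    "b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c",
}

def scan_before_send(attachment: bytes) -> bool:
    """Return True if the attachment matches a blocklisted digest.

    In a client-side scanning design, a check like this would run on
    the user's device before the message is encrypted and transmitted.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_HASHES

if __name__ == "__main__":
    message = b"example attachment bytes"
    if scan_before_send(message):
        print("match: flag for review")  # a real system would file a report
    else:
        print("no match: send normally")
```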
Rosager Ludvigsen, along with fellow authors Shishir Nagaraja and Angela Daly, argues in an arXiv preprint that CSS “should be limited or not used” because it can give rise to “possible human rights violations”.
Conflicts with the Charter of Fundamental Rights
The EDPB and EDPS argue that the Commission’s plan fundamentally undermines the EU’s Charter of Fundamental Rights (“the Charter”), namely Article 7 (the right to respect for private and family life) and Article 8 (the right to protection of personal data).
The opinion acknowledges that “the rights enshrined in Articles 7 and 8 of the Charter are not absolute rights, but must be considered in relation to their function in society.”
The opinion further notes that child sexual abuse is “a particularly serious and heinous crime” and that combatting such crime is an “objective of general interest” to the EU.
However, where measures limit fundamental rights, they must only do so in a way that complies with Article 52(1) of the Charter. This provision requires that any limitations on rights must:
- Be provided for by law
- Respect the essence of rights and freedoms
- Comply with the principles of necessity and proportionality
- Genuinely meet the objectives for which they are necessary
The opinion suggests that several areas of the proposal would fail to meet the above test.
For example, regarding the principle of proportionality, the opinion notes that detection orders would be “applied to an entire service and not just to selected communications” for long durations, and thus lead to “general and indiscriminate” monitoring of people’s communications.
“…in practice, the Proposal could become the basis for de facto generalised and indiscriminate scanning of the content of virtually all types of electronic communications of all users in the EU/EEA,” the opinion states.
Conflicts with the ePrivacy Directive
The EDPB and EDPS argue that the proposal would contradict the current privacy safeguards set out in the ePrivacy Directive.
The proposal would require EU member states to enable severe curtailment of people’s privacy rights—rights protected under the ePrivacy Directive.
But the proposal cites Article 15(1) of the directive, which allows EU member states to restrict people’s rights to confidentiality of communications and traffic data if such a restriction is a “necessary, appropriate and proportionate measure within a democratic society.”
The EDPB and EDPS disagree that the measures set out in the proposal are “necessary, appropriate and proportionate”.
“…the CJEU has made it clear that Article 15(1) of the ePrivacy Directive is to be interpreted strictly,” the opinion states, “meaning that the exception to the principle of confidentiality of communications… must remain an exception and must not become the rule.”
“…the Proposal would entail that the interference with confidentiality of communications may in fact become the rule rather than remain the exception”.
Conflicts with the GDPR
The EDPB and EDPS also argue that the proposal could violate certain provisions of the General Data Protection Regulation (GDPR).
Under the GDPR, processing personal data requires one of six “legal bases”. The proposal suggests that the appropriate legal basis for detecting child grooming and CSAM would be Article 6(1)(c) of the GDPR, known as “legal obligation”.
The EDPB and EDPS “welcome” this attempt to “eliminate legal uncertainty” that existed under earlier legislation relating to the detection of CSAM.
However, the opinion notes that any legal basis for processing under the GDPR must comply with Article 52(1) of the Charter—which, as explored above, is arguably violated by the proposal.
Effectiveness of the Proposal
The EDPB and EDPS also assert that the technical means proposed to detect child sexual abuse would likely be less effective than envisaged in the proposal. This, too, could render the plans incompatible with EU law.
In a section of the opinion titled “Effectiveness of the detection”, the EDPB and EDPS argue that “there seems to be no technological solution to detect CSAM that is shared in an encrypted form.
“Therefore, any detection activity—even client-side scanning intended to circumvent end-to-end encryption offered by the provider—can be easily circumvented by encrypting the content with the help of a separate application prior to sending it or uploading it.”
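That circumvention is straightforward to demonstrate. In this toy continuation of the earlier sketch (a simple XOR keystream stands in for a real cipher and is not secure encryption), encrypting the bytes with a separate tool before sending changes the fingerprint, so the same blocklist check no longer matches:

```python
import hashlib

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # XOR keystream as a stand-in for a real cipher such as AES;
    # the point is only that every output byte differs from the input.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

attachment = b"example attachment bytes"
key = b"secret shared out of band"

plain_digest = hashlib.sha256(attachment).hexdigest()
cipher_digest = hashlib.sha256(toy_encrypt(attachment, key)).hexdigest()

# The scanner only ever sees the ciphertext, whose digest shares
# nothing with the blocklisted digest of the original file.
print(plain_digest == cipher_digest)  # False
```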
In addition, the EDPB and EDPS note that the technologies currently available for detecting CSAM are prone to “relatively high error rates” and could result in many people being wrongly reported for having “potential” CSAM on their device.
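The scale problem behind those error rates can be made concrete with some back-of-the-envelope arithmetic. The figures below are illustrative assumptions, not numbers from the opinion: even a scanner that wrongly flags only one innocent message in ten thousand produces hundreds of thousands of false reports a day when applied across billions of messages.

```python
# Illustrative assumptions, not figures from the opinion:
messages_per_day = 5_000_000_000   # messages scanned across a large service
false_positive_rate = 1e-4         # one wrong flag per 10,000 innocent messages
prevalence = 1e-6                  # fraction of messages actually containing CSAM

true_positives = messages_per_day * prevalence
false_positives = messages_per_day * (1 - prevalence) * false_positive_rate

print(f"flags per day: {true_positives + false_positives:,.0f}")
print(f"of which false: {false_positives:,.0f}")
# Under these assumptions, roughly 500,000 innocent messages would be
# flagged every day, dwarfing the ~5,000 genuine detections.
```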
Strathclyde University’s Rosager Ludvigsen argues that the entire approach of the proposal could be ineffective in combatting child abuse.
“I teach cybercrime which includes issues with CSAM, and one of the main points from both criminologists and everyone else working with preventing CSA in practice, is that limiting the spread of CSAM is just cutting the tip of the iceberg off constantly,” he said.
“It is symptom treatment, not disease prevention or healing.”