The statistics are sickening. In 2020, child abusers, like hackers, took advantage of the Covid-19 pandemic to have a field day on the internet:
--Four million images and videos of child abuse were reported last year (compared to 725,000 in 2019), content that was mainly hosted in Europe; 95% of it was reported by Facebook and shared via Messenger. As many as 1,500 alleged abusers tried to seduce children online to get photos, videos or live broadcasts.
--The number of complaints about such material rose from one million in 2010 to 17 million in 2019, but the number of reports has already fallen by 51% in 2021, following changes that the GDPR brought to the ePrivacy Directive.
--When administrations moved to remote working and thinned their staff, the fight against this abuse became ineffective.
--Peer-to-peer exchanges of child pornography have exploded, as has the presence of abusers on the dark web.
--The increased presence of children on social networks, online gaming sites and messaging applications during lockdown has been accompanied by a worrying influx of abusers, who have become adept at joining and blending into discussions, for example via “Zoombombing”, where strangers interrupt sessions held on the video conferencing app, including school class meetings.
--And, topping it all off, the services offered by the tech giants were deemed largely inadequate in an assessment that preceded the vote on the exemptions. Of the technologies studied (Microsoft PhotoDNA; Microsoft Project Artemis; Facebook’s algorithms PDQ and TMK+PDQF; and Thorn’s Safer tool), only Microsoft PhotoDNA passed the assessment (a brief sketch of the hash-matching principle behind such tools follows this list).
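For readers unfamiliar with how such detection tools operate: systems like PhotoDNA and PDQ compute a perceptual hash of an image and compare it against a database of hashes of known abuse material, so that matches survive resizing or re-compression. The snippet below is a minimal, hypothetical sketch of that matching principle using the open-source Python imagehash library; the placeholder hash value, threshold and file path are illustrative assumptions, and none of the assessed tools' actual algorithms are reproduced here.

```python
# Hypothetical sketch of perceptual-hash matching, the general principle behind
# tools such as PhotoDNA or PDQ (whose actual, proprietary algorithms differ).
# Requires: pip install Pillow imagehash
from PIL import Image
import imagehash

# Hashes of known material would normally come from a vetted database;
# the value below is a made-up placeholder for illustration only.
KNOWN_HASHES = {imagehash.hex_to_hash("d1c4a0b2e3f49586")}

# Hamming-distance threshold; a tuning assumption, not a value from any assessed tool.
MAX_DISTANCE = 8


def matches_known_material(path: str) -> bool:
    """Return True if the image's perceptual hash is close to a known hash."""
    candidate = imagehash.phash(Image.open(path))  # 64-bit perceptual hash
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)


if __name__ == "__main__":
    print(matches_known_material("example.jpg"))  # "example.jpg" is a stand-in path
```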
A temporary solution, an unfortunate precedent
Stunned by the explosion in cases, the European Union decided to grant derogations from the ePrivacy Directive for three years, putting an end to the safeguarding of online privacy in the name of child protection.
This is exactly what child protection advocates had called for. “Every second and every child counts,” said Maud de Boer-Buquicchio, President of Missing Children Europe and former UN Special Rapporteur on the sale of children, child prostitution and child pornography. “We cannot let privacy law prevail over the need to scan and take down content of abuse of children.”
The regulation adopted on Tuesday in the European Parliament was passed by 537 MEPs, including four from Luxembourg: an independent, a DP member, Isabel Wiseler-Santos Lima (CSV) and a second CSV member. It provides for the a priori analysis of all private correspondence via artificial intelligence tools--word and image recognition--which will officially remain under human supervision. This is despite the fact that 72% of respondents to a survey by the ChatControl initiative, led by Pirate Party MEP Patrick Breyer, who sits with the Greens in parliament, are against the idea.
He said 86% of alerts would be irrelevant, yet would still result in the “transfer” of private data to third parties and to national police or judicial authorities.
One of Europe’s two data watchdogs, the European Data Protection Supervisor, expressed its doubts about the text, stressing that privacy remains a cornerstone of fundamental rights in Europe, that measures must be taken to ensure that abuses are avoided, and that the duration of the exception (until 2025) is not proportionate to the objective. In effect, the exception will end when the authorities have agreed on another arrangement.
“The proposal is so poorly written that it leaves the door open for abuse of power, for false reporting and for large-scale surveillance of all our private communications. Moreover, there is no evidence that such measures would be effective at all, and many critics believe (myself included) that such activities would simply be pushed underground, making them much harder to detect,” complained Alexander Hanff, an abuse survivor.
Even the European Parliament’s rapporteur on the proposal, Birgit Sippel, has changed her stance. She said in February: “Under no circumstances will we accept a race to the bottom. We will not stand for any attempt to undermine the current levels of protection for data protection and privacy.”
But the German Socialist MEP told the European Parliament this week: “We have fought in parliament to try and improve this proposal, and to bring it in line with existing standards. We have added additional protection measures, better information to users, and also a limit of 12 months on the data being kept. We have worked with European authorities and also service providers, so they work closely with national data protection services to look at these technologies.”
Ineffective mass surveillance
“The result is a random sweep of communications, and this is neither a proportionate nor an effective response to crime,” replied Cornelia Ernst (the Left). “What is being achieved is something quite different: this is mass surveillance of communications, and it is illegal. Providers of messenger services such as Facebook are now mandated to scan the content and metadata of all communications for known and unknown material, resulting in vast numbers of completely unsuspicious pictures and images being affected and captured, and even people who are completely innocent are coming under general suspicion, and on a massive scale.”
EU Home Affairs Commissioner Ylva Johansson also said: “Voluntary reporting is not enough. Right now, 95% of all reports come from one single service provider. It’s time for all providers to live up to their responsibilities. To make sure that no one can use their services to spread videos or pictures of children being raped or, worse, facilitate the rape of children.”
This article was originally published on Paperjam. It has been translated and edited for Delano.