Child Sexual Abuse Material (CSAM) is the biggest form of child sexual exploitation in Luxembourg. Tech “companies and service providers should be actively looking to see what is happening on their website and services,” says Noémie Losch from ECPAT Luxembourg. Shutterstock.

Child sexual abuse has quickly become a hot topic in the past few weeks. From Virginia Giuffre's civil lawsuit against Britain's Prince Andrew in New York to expected EU legislation that will require tech companies to be more hands-on in reporting child sexual abuse on the internet, the media is buzzing with updates. ECPAT Luxembourg sheds more light on the status quo and the challenges in Luxembourg.

The link between technology and child abuse has also been raised by several campaigners warning about the limitations end-to-end encryption could pose for the detection of child abuse cases. This is a feature that Facebook (now known as Meta) initially intended to roll out across all of its applications, including Facebook Messenger and Instagram, by 2022, but has now delayed until 2023 amid the concerns raised.

“At least 80 percent of all crimes leave digital footprints. Child sexual abuse #CSA alone generates tens of millions of files every year. But in a world awash with data, finding evidence can be like looking for needles in a haystack,” EU home affairs commissioner Ylva Johansson tweeted, after revealing in an interview with the German newspaper Welt am Sonntag her plans to propose “legislation that obliges companies to identify, report and remove child sexual abuse” in the coming months.

The major change concerns the legal reporting requirements for big tech companies, which are currently not obliged to report but do so on a voluntary basis, with social network and technology company Meta accounting for some 95 percent of the reports.

Luxembourg, alongside the Netherlands, France and Latvia, was cited among the countries hosting the most child sexual abuse URLs in 2020, according to the most recent annual report of the Internet Watch Foundation (IWF). Child sexual abuse imagery was traced to 153,369 URLs across 5,590 domains in 59 countries in 2020 alone, a 13% increase compared to 2019. The Netherlands topped the list, accounting for 77% of child sexual abuse URLs in this period. It has since put a hash check service at the disposal of companies, which they can use to clean up their servers.

Luxembourg's justice minister, Sam Tanson, has introduced a new bill to fight child sexual abuse, exploitation and child pornography, amending the penal code to set higher penalties for offenders.

In 2020, 22 million child sexual abuse reports were made by internet service providers and social media companies in the EU. Delano spoke with Noémie Losch from ECPAT Luxembourg, a non-governmental organisation on a mission to end all forms of sexual exploitation of children for commercial purposes through legal means and to raise public awareness of children's rights, about the possible implications of this new law and the scale of the issue in Luxembourg.

With all that’s been happening across the world and EU plans for new legislation making the headlines, it may seem like there’s a renewed urgency to fight child sexual abuse. Why now?

Noémie Losch: “We've had a huge increase in numbers. Especially during covid, the numbers were just exploding. I mean, the number of reported cases of child sexual abuse online is massive, and Luxembourg has been the same. Obviously, the whole change of strategy at the European Commission level, with the new Digital Services Act proposed under von der Leyen, has [instigated] further changes in legislation.

We have legislation at the Luxembourg level, which is obviously based on European legislation, stating that companies [must] remove content if they are aware of it.

If a company, service provider or hosting enterprise is notified that they have child sexual abuse material (CSAM) on their services, they have to remove it. But they are not actively looking for it. So it's only when you're made aware of it. If you don't look, you won't find anything, and you can't be held accountable because you didn't know.

Currently, companies are encouraged to be more aware of what is happening on their services, but they don't have to be, [as] it's all on a voluntary basis. But we all know that if it's on a voluntary basis, the majority will not do it. So what we're seeing now, with the increase [in child sexual abuse cases] [partly due] to the pandemic, is that they're just becoming stricter, which is great.

We're really happy about this development, because a few months ago the [EU] wasn't as strict and the commission didn't have such a strong stance on it. We're happy because it's [also] something that we've been advocating for. If we're talking about child protection, it shouldn't be voluntary. Companies and service providers should be actively looking to see what is happening on their websites and services.”

What do the numbers tell us about the actual scale of the issue in Luxembourg?

“One of the big problems we're having is [with] numbers. We don't have official numbers. In any case, it's always really difficult to have actual numbers for sexual crimes. [Another] problem is that we don't have a system that collects all the data. So what we do is go to individual places and try to collect the data we can find. The numbers we do have are from KJT [operator of the Bee Secure Luxembourgish hotline], which is responsible for removing content. The Bee Secure Stopline is where you report illegal content, and then they will transmit this to the police to take action. In 2020, they had between 4,000 and 22,000 URLs being reported. And that's just the URLs. That doesn't mean it's one image [per URL]. This can vary a lot. Between 2019 and 2020, there was a 32% increase in the material being reported.”

The prosecution of offenders remains too low and quite problematic. Is this part of your campaign strategy at ECPAT?

“These are really important points. On the one hand, there is the data we don't have [in terms of] official numbers. Without official numbers, it's really hard to advocate and say, listen, there's a problem. It's the same with the cases. The problem is that we don't have access to all the judgments, but they need to be transparent.

Of course, we're not going to know who the person is, but there are probably around 40 cases happening per year, and we don't know what's happening with them. Sometimes cases are made public, but most of them are not. If you go on the justice ministry website, where you can look up the judgments, you don't find everything either, because it's them deciding which ones they think are interesting or important enough to put up, and then they decide on the keywords. So if you look up [certain terms], you will not find every case. It was really hard for us, too, when we were doing campaign research. Obviously, we didn't have any numbers to present. We don't know which sentences are pronounced. We don't know whether they are merely symbolic or whether they really reflect the gravity and severity of the crime.

It's a really hard issue for us, and something we've been advocating [for], to make judgments public, because we need to know what's happened. Sometimes we get numbers if a deputy writes a parliamentary question. We know that in 2019 there were around 25 cases where there was a judgment (I think it was 37 in 2018 and 44 in 2017), so we know it's always between 20 and 45 per year at least.

But then we don't get any info, and the ones we do know about always end in probation. It means that if a sentence is pronounced, it will be on probation. And then we need to argue why it shouldn't be on probation. That's really hard from a legal point of view: [even with] thousands of images and hundreds of videos found on [offenders'] hard disks, they only get probation or a very small [fine], and it's not very representative [of the crime committed].”

Given the circumstances, how can a sense of urgency be created for people and companies to engage, react and report?

“Good question. I think it's really important, and there are different layers. On one hand, it's about financial resources and a mechanism to collect the [actual] numbers. If this is done, the numbers would already be a bit higher. And once you have higher numbers, you can also get a political debate. You have more politicians talking about it and more resources spent on it. And the more resources being spent on it, the more [cases] you will find.

On the other hand, as long as we don't have the numbers, what we need to do is explain what this is and [especially] deconstruct the term “child pornography”, because it does not represent the crime that it really is. It's child sexual abuse material. And what is really happening is that children are being raped or subjected to a form of sexual violence by an adult.

Every time a person looks at CSAM, the crime is repeated. Even if it's just one case, it's already too much. As part of grooming, a lot of the material that is online has become self-produced images [by kids]. It really comes down to telling people this is happening in Luxembourg, not somewhere far away. It's our kids that are being groomed while they're at home. Everyone has their role to play: tech companies, child protection organisations, the state, police, etc.”

What are some solutions and prevention efforts against child sexual abuse content circulating online?

“Tech companies have the solutions for the tech industry. [For example], hashing is a technology that scans services and detects pictures, so that known CSAM can be taken down. That alone would massively decrease the number of child abuse images online. In terms of grooming, you can act before the abuse; some technologies can even detect when someone is grooming a child [based on] certain patterns.”
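To illustrate the hash-matching approach Losch describes, here is a minimal sketch in Python. The `KNOWN_HASHES` set and the `./uploads` directory are hypothetical placeholders, not part of any real hash list; production systems such as Microsoft's PhotoDNA or the IWF hash list use perceptual hashing, which also matches resized or re-encoded copies, whereas the exact SHA-256 digests below only catch byte-identical files.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known hashes, in practice supplied by a hotline
# or an industry hash list (the value below is an illustrative placeholder).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_sha256(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_directory(root: Path) -> list[Path]:
    """Return every file under root whose digest matches a known hash."""
    return [
        p for p in root.rglob("*")
        if p.is_file() and file_sha256(p) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    # Hypothetical upload folder a service might sweep during a cleanup.
    for match in scan_directory(Path("./uploads")):
        print(f"Flagged for review and removal: {match}")
```

The trade-off is that exact hashing is cheap and produces no false positives, while perceptual hashing is needed to catch modified copies; the hash check service mentioned above works on the same match-against-a-known-list principle.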

Are there any major campaigns or programs ECPAT will be organising or leading this year?

“We’re going to publish our recommendations about CSAM, summarising a lot of the points in one document. We will also be running a campaign later in the year on the sexual exploitation of children in tourism and travel. And our campaigns and workshops with interested schools will continue.”

ECPAT Luxembourg is part of an international network of organisations present in over 100 countries across the world.