The Parliament Magazine: After the vote on the amendments and the adoption of the report in plenary, are you convinced that all expectations have been met?
Christel Schaldemose: I am very happy with the result. After all, it is not uncommon for a report submitted to the plenary to be slightly modified. And that makes sense, because if we always knew the outcome would be exactly as proposed, we wouldn’t need to bring the report back to the plenary at all.
Today, the whole Parliament supports it, with a very strong majority. So I am really very happy, and from Monday (31 January), when we begin negotiating with the Council, we have a chance to improve the legislation further.
PM: What do you expect from the Council? They already agreed on their mandate in November, where do you see the biggest differences emerging between your position and theirs?
CS: It’s a good question, actually. We always have discussions on enforcement, because that is the Council’s main interest; the Member States are the ones who have to deliver it. And then they will likely push back on our additions, such as the ban on targeted advertising aimed at children and the provisions on consumer protection and compensation.
They had agreed on their negotiating mandate based on the European Commission’s proposal and it did not include a ban on targeted advertising.
The Council tends to say, “well, your additions are not what we discussed”, and sometimes declares itself unwilling to negotiate on that basis. I expect them to try that, but we have brought in a lot of additional aspects, and we believe they are all important.
Compensating consumers when the restrictions on targeted advertising are violated, for example, will improve online safety and trust. So I expect we will have a tough fight on this, but I am also confident that we can get a good result.
“What we are proposing is that these platforms need to verify the effects of their algorithm, and if there is evidence that there is a negative impact, for example on mental health, then they will have to mitigate that and change the algorithm”
PM: One aspect to which Parliament has given more attention is the transparency of the use of algorithms. I suspect this may have been partly prompted by Facebook whistleblower Frances Haugen’s testimony in Parliament, when she pointed out that regulating for this doesn’t work very well because the process of legislating takes time and the engineers behind the use of algorithms working for the big platforms will always be ahead. Do you think your proposals take that into account?
CS: We tried to find ways to address Frances Haugen’s concerns about this. Instead of regulating what they are not allowed to do on algorithms, we are saying that very large online platforms must carry out risk assessments, not only looking at illegal content, but also other types that could be detrimental to the user.
One example Haugen mentioned was how girls seeking help with nutritional issues were directed by Instagram to recommendations on self-harm and anorexia. So what we’re suggesting is that these platforms need to check the effects of their algorithm, and if there’s evidence that there’s a negative impact, for example on mental health, then they need to mitigate that and change the algorithm.
In other words, we have placed the obligation on the platforms. They cannot simply wait for us to regulate in three years’ time; they themselves have to check the consequences.
Of course, the Commission, accredited researchers and NGOs will also have access to this process, so that we are in effect opening the so-called ‘black box’ of algorithms, in real time, whenever an algorithm is changed.
We have compiled a long list of issues for platforms to consider, including health, fundamental rights and effects on democracy. I believe this will work and change the negative sides of social media platforms for the better.
“There’s no doubt that the Big Tech lobby will keep trying because they want to decide for themselves what they can do. But we as politicians have an obligation to write the rule book and to decide in a democratic way, to take back control of the internet”
PM: Some observers predict that the big platforms will use the so-called “trade secrets” argument to circumvent the opening of their algorithm “black box”. How confident are you that this won’t sabotage your efforts?
CS: Very confident. They do not have to open it up to their competitors but to the Commission, to researchers and to approved NGOs. So, in my opinion, there should be no discussion of trade secrets. Platforms must ensure that their algorithms do no harm to society, including showing the parameters of their recommendation systems, but they are not required to reveal their algorithms.
PM: EU Internal Market Commissioner Thierry Breton has done much of what he describes as pushing back against Big Tech’s lobbying efforts, including in a playful clip he posted on social media of the showdown scene from “The Good, the Bad and the Ugly”, casting the “good” DSA as Clint Eastwood confronting the “bad” of hate speech and misinformation and the “ugly” of Big Tech lobbying. He is not too optimistic about it, is he?
CS: No doubt the Big Tech lobby will keep trying because they want to decide for themselves what they can do. But we, as politicians, have an obligation to write the rule book and decide democratically, to take back control of the internet.
Will companies comply? We don’t know yet, but if they don’t, they risk being hit with a pretty hefty fine and facing compensation claims.
Some of them may try to keep making easy money and bend the rules, but I think the rules will be good enough to tackle that, and companies will comply with them even if they do not like it. Their immense lobbying efforts have also shown us just how much is at stake for everyone involved.