Elon Musk’s Twitter could be on the hook for a pipeline of multimillion-dollar penalties for failing to take down illegal hate speech in Germany.
Fines could even stack up to billions if the federal government acts on the scores of cases of content moderation inaction that have already been reported to it and German courts confirm the law has been breached.
Earlier this week, the federal government announced it was instigating a procedure over suspected systemic failures under the country’s hate speech takedowns law. The law, known colloquially as NetzDG, allows for fines of up to €50 million per case.
The federal government is acting on just a handful of tweets out of hundreds that have been reported and collated in a database, per lawyer Chan-jo Jun, the founder and managing partner of the specialist IT law firm, JunIT Rechtsanwälte.
Jun is representing the antisemitism commissioner of the federal state of Baden-Württemberg, Michael Blume, who he says has been targeted by abusive and defamatory tweets that Twitter has refused to take down. Some of the abusive tweets were posted by a Twitter user who had been banned before Musk took over the platform but had his account reinstated under Musk's general amnesty for suspended accounts.
Late last year, the law firm went to court seeking an injunction against Twitter for failing to act on the reports to remove hate speech under the NetzDG law. The legal challenge succeeded in establishing the tweets were illegal. And it appears to have contributed to spurring the federal government into action — which, on Tuesday, said it had established “sufficient indications of failures” in Twitter’s complaint management processes to start a process that could result in the first penalty for a social media firm for failing to remove illegal content under NetzDG.
Discussing the background to the case, Jun told TechCrunch his firm had reported a number of tweets to the Federal Justice Office (BfJ) last year but were initially told it did not have enough material to establish there was a systemic failure.
“We had reported a number of cases to the [BfJ] at that time, and found that they agreed that these tweets were illegal but said they do not have enough material for a systematic failure. And that’s when a group of volunteers started to systematically search for illegal content and keep reporting that and making a huge database… and they kept submitting that to the [BfJ]. So it’s over 600 cases,” he said.
“The ones that are now subject to the [federal government’s] case appear to be just the first ones. They picked them out because they were all similar in that way — I think they came from the same user and had the same content. That’s probably why they chose those because it would be the easiest case to see that is systematical failure. That it was not a single failure of one content moderator but actually that the vast majority — or all — of reports were wrongly handled.”
Late last year, The New York Times reported on research by the Center for Countering Digital Hate, the Anti-Defamation League and a number of other groups that study online platforms which found major increases in hate speech since Musk took over the platform at the end of October and set about slashing headcount — including cutting staff in Germany and other international offices.
Concerns over Musk's impact on Twitter's content moderation in the region have also drawn shots from the EU's executive, the European Commission, which will be taking up a major oversight role over larger platforms under the Digital Services Act later this year — a regulation that sets rules for how services must respond to reports of illegal content.
Internal market commissioner Thierry Breton warned Musk last November that the company has huge work ahead of it to comply with the incoming pan-EU rulebook — under which penalties for breaches can scale up to 6% of global annual turnover. So if Twitter isn't bankrupt yet, under its erratic and heavily leveraged billionaire owner, it's facing a cripplingly costly future if Musk keeps thumbing his nose at regulators and ignoring laws he doesn't like.
Even just in Germany, if the BfJ were to act on the 600+ illegal hate speech cases that have already been reported to it, fines could sum to as much as €30 billion for Twitter, based on the maximum penalty of €50 million per case set out in the NetzDG law.
Of course a theoretical maximum outcome is unlikely. But there's clearly no shortage of cases the BfJ could enforce — meaning fines for Musk-owned Twitter's failures to purge hate speech could nonetheless quickly stack up. And Musk doesn't have a limitless supply of legacy office furniture to sell to service his debts.
Jun says he expects the first penalty on the tweets the BfJ has taken action on to be set below the maximum — assuming the court confirms they are illegal.
“The law expects fines of up to €50 million for each case. It is possible that at first they will not take the full amount. There’s actually a table… that states the intensity of the failure. So I would expect something between €5M and €20M to be the first fine,” he suggests.
“It will take a little longer time for it to go through the entire procedure because now it is up to the courts to decide if the contents are illegal or not. I’m pretty sure however, they are illegal because they have been subject to court decisions already.”
“I had been putting pressure on the BfJ and the minister for the last seven months,” he adds. “And actually because I thought that the systematical failure had been obvious — with all kinds of crimes committed on Twitter, including child pornography, which also systematically was not taken down in the past. And the same is with most defamation cases — especially those where the legitimacy of a tweet cannot be judged by only the content of the tweet itself, where you have to explore what the truth actually is when defamation is being spread on Twitter.
“And what we saw in the past is that Twitter decided not to take the effort of asking users to provide any proof for what they say… even though this is being done by other social networks, such as Google, or [other social] networks in Germany.”
No one would suggest that Twitter, pre-Musk, was doing a perfect job of content moderation. Far from it. And it remains curious that Germany has not pursued any social media firms for failing to comply with NetzDG's content takedowns requirements up to now (after all, the law has been in force since 2017).
But if Musk has done anything fast, it's scorch trust with regulators and lawmakers — by gutting resources for content moderation and sacking the policy staff whose job it was to engage in dialogue with regulators over contested speech issues and press the company's case. That leaves the institutions he loves to mock with little recourse but to assume the worst and get on with applying the law.