
EU internet policy sparks free speech concern

Teri Schultz
September 28, 2017

The EU wants online companies to filter out illegal content, but without a standard definition of what that is or how it should be removed. Critics say the potential for unwarranted censorship sends a dangerous message.

EU Justice Commissioner Vera Jourova (Image: EU/Georges Boulougouris)

Vera Jourova says she is asking internet companies to do more to stop online hate speech, and not only in her capacity as the European Union's commissioner for justice. Last year, she closed down her own Facebook account after a nonstop stream of online abuse. "It was the highway for hatred and I am not willing to support it," Jourova told a news conference on Thursday. "I have still not [re-]opened it."

EC: Speed up the takedowns

Jourova was one of four commissioners introducing the EU's latest attempt to find a way to regulate speech online. The move is part of an effort to weed out and punish those who cross the line from potentially uncomfortable speech to what the bloc considers verifiably illegal – when they incite violence or hatred, for example. The European Commission (EC) already concluded a voluntary "code of conduct" last year under which online giants like Facebook and Twitter agreed to try to remove illegal content flagged by users within 24 hours. The companies say they are complying with the agreement, but Jourova believes it is not happening fast enough. By next year, the agreement may be made binding, with sanctions attached for noncompliance.

"We cannot accept a digital Wild West, and we must act," Jourova explained. 

The new guidelines call on internet platforms to "swiftly and proactively detect, remove and prevent the re-appearance of content online," the latter task potentially through the use of "automatic tools." Examples include incitement to terrorism, illegal hate speech, child sexual abuse material or infringement of intellectual property rights.

"Internet platforms have a duty of care to protect ordinary users from terrorist propaganda and illegal hate speech," Security Commissioner Julian King declared. "We are counting on them to be much more proactive."

Lack of harmonization may be harmful

But some of the commission's desired actions are raising red flags elsewhere. Members of the European Parliament's civil liberties committee (LIBE) were among the first to blast the recommendations presented Thursday. Jan Philipp Albrecht of Germany's Green party told DW that the commission's goal may be good, but it's taking the wrong approach to get there.

Albrecht says the fact that Jourova shut down her Facebook account over the presence of hate speech is the "clearest signal of weakness a politician could give" about the power of government to enforce the law. He says the commission's first and most important task is to establish a more specific, EU-wide definition of what constitutes unacceptable content and of what should happen when it is detected. Right now, each government has its own definition, and companies are often left to their own devices to identify and delete illegal content.

Germany has already passed a strict law of its own, the Network Enforcement Act (NetzDG), which authorizes fines of up to 50 million euros against companies that do not promptly remove hate speech.

"If any internationally active company like Facebook or Twitter is faced with 28 different rules, 28 different interpretations [of what constitutes hate speech] and you just say 'please just do something,' they will all have their own rules," Albrecht explained in exasperation. "They will just do what they need to do in order to not be blamed but it's not the law and that's problematic."

Lawmakers: Heavy hand a bad example 

Albrecht doesn't think YouTube or Instagram or any other online company should have either such authority or responsibility. "I fear that with this in the hands of the companies they will just use the automatic means they have," Albrecht said. "And that could mean that in the end, algorithms are deciding what is to be taken down and what not ... automatic takedown is not based on rule-of-law procedures."

Marietje Schaake, a fellow lawmaker on the LIBE committee from the Netherlands, calls the approach "dangerous," even a potential model for repressive behavior by authoritarian regimes such as China, Russia and Turkey. In a press release, Schaake said "the Commission should be pushing back against the trend, not embracing it. There can be no room for upload filters or ex-ante censorship in the EU."

European Digital Rights, an advocacy group devoted to free speech online, echoed Schaake's concern, saying the proposal "presents few safeguards for free speech, and little concern for dealing with content that is actually criminal."

Meanwhile, EuroISPA, the Brussels-based organization representing more than 2,300 internet service providers (ISPs), countered that its members wouldn't be comfortable with or capable of exercising such authority. As primarily small- and medium-sized enterprises, EuroISPA wrote, "they are simply unable to properly assess the context-dependent legality of content" without clearer legal guidance.

"Without this judicial clarity, ISPs are trapped between the risk of failing to properly identify illegal content and the risk of engaging in excessive censorship, thus undermining the fundamental rights of their users," the organization said. "The overwhelming majority of citizens use the Internet for its inherently empowering characteristics. And in that context, we must ensure that structures are in place such that ISPs' efforts to remove illegal content do just that, and not more."

David Meyer, a technology writer who authors the "Connected Rights" newsletter, said the commission has no choice but to do more of the work itself.

"It's naturally tempting for the Commission to ask the online platforms to tackle such problems in an automated way, because of the sheer amount of content that's involved, but this is effectively privatizing law enforcement," he told DW. "The platforms would devote as few resources as possible to meeting their obligations, and due process would suffer. If we are to maintain justice in the online context, we need our judicial systems to decide whether content should be taken down or not, expensive as that may be."

Meyer said judicial systems need to be equipped to handle the crush of queries and appeals as "algorithms are not legal experts and they cannot be the arbiters of the public interest."
