British MPs urge social media to do more to tackle terror

August 25, 2016

A report released by the Home Affairs Select Committee says major internet firms are not doing enough to stem terrorist activity. It says the companies should hire more staff to police material inciting hate crimes.

The parliamentary committee's report asked companies like Facebook, Google and YouTube to work in close coordination with the government, police and security services to create a round-the-clock hub to monitor and immediately shut down extremist activity.

"Huge corporations like Google, Facebook and Twitter, with their billion-dollar incomes, are consciously failing to tackle this threat and passing the buck by hiding behind their supranational status, despite knowing that their sites are being used by instigators of terror," Keith Vaz, chairman of the committee, said on Thursday.

The report, which focused exclusively on radicalization among Muslims, said there was no evidence that a single path or event was responsible for triggering changes in people's behavior. However, the internet had a "huge impact in contributing to individuals turning to extremism, hate and murder," it said.

The report called it "alarming" that teams of only a few hundred employees monitored billions of internet accounts, and it directed the companies to set aside dedicated staff to work with the London police's Counter Terrorism Internet Referral Unit.

The report's recommendations were expected to become part of new counterterrorism legislation in the UK, the Countering Extremism and Safeguarding Bill.

Websites respond to government's call

"In the rare instances that we identify accounts or material as terrorist, we'll also look for and remove relevant associated accounts," Simon Milner, Facebook UK's director of policy, said in a statement.

Twitter said it suspended 235,000 accounts suspected of being linked to groups like the "Islamic State" (IS) in the past six months. This was double the number it suspended from mid-2015 to February 2016.

YouTube said its employees removed content and terminated accounts run by terror groups. "We'll continue to work with government and law enforcement authorities to explore what more can be done to tackle radicalization."

mg/kl (Reuters, AP)
