London, Aug 25 (IANS) Facebook, Twitter and YouTube were accused by the members of British Parliament of “consciously failing” to combat the use of social networking sites to promote terrorism and extremism.
A House of Commons report of home affairs select committee said the social media networks were becoming the “vehicle of choice in spreading propaganda and the recruiting platforms for terrorism”, the Guardian reported on Thursday.
Their failure to tackle this threat had left some parts of the internet “ungoverned, unregulated and lawless”, said committee chairman Keith Vaz.
Vaz demanded that the social media networks work much more closely with the police to shut down terrorist activity online immediately.
The report comes after the authorities last week struggled to get online posts by the convicted radical Islamist cleric Anjem Choudary deleted even after his arrest for inviting support for Islamic State militant group.
The MPs’ inquiry into tackling radicalisation recommended that the government’s “Prevent” programme be rebranded as the “Engage” programme to remove its “toxic” associations in the Muslim community.
According to the Guardian, the web organisations, reacting strongly to the combative tone of the report, said they took seriously their role in combating the spread of extremism.
Twitter last Friday said it had suspended 235,000 accounts for promoting terrorism in the past six months.
Facebook insisted it dealt “swiftly and robustly” with reports of terrorist-related content.
The US State Department and the French Interior Minister both praised Twitter for moving swiftly to try to remove the IS from its platform.
But the report said the suspension of 350,000 Twitter accounts since mid last year and Google’s removal of 14 million videos in 2014 relating to all kinds of abuse were “in reality a drop in the ocean”.
“We are engaged in a war for hearts and minds in the fight against terrorism. The modern front line is the internet. Its forums, message boards and social media platforms are the lifeblood of Daesh [the Arabic for the IS] and other terrorist groups,” Vaz said.
“Huge corporations like Google, Facebook and Twitter… are consciously failing to tackle this threat and passing the buck by hiding behind their supranational legal status, despite knowing that their sites are being used by the instigators of terror.”
Vaz said it was alarming they had teams of “only a few hundred” employees to monitor billions of social media accounts and that Twitter did not even proactively report extremist content to the law enforcement agencies.
The MPs wanted the British government to introduce measures requiring the web companies to cooperate with the police’s specialist unit by promptly investigating hate speech sites and closing them down, or explaining why they have been left online.
The unit should be upgraded to a round-the-clock “central hub” operation, the report recommended.
The committee also wanted the web companies to publish quarterly statistics showing how many sites and accounts they have taken down, and would like the success of the Internet Watch Foundation in tackling online child sexual abuse replicated in countering online extremism.
Responding to the report, Home Office Security Minister Ben Wallace said it was vital everyone played their part in defeating extremism.
“We are working closely with the internet companies and want to see a swifter, more automated approach to identification and removal of content from social media sites, not just in Britain but across the world,” he said.
Simon Milner, director of policy at Facebook in Britain, said: “As I made clear… terrorists are not allowed on Facebook and we deal swiftly and robustly with reports of terrorism-related content.
“In the rare instances that we identify accounts or material as terrorist, we’ll also look for and remove relevant associated accounts and content. Online extremism can only be tackled with a strong partnership between policymakers, civil society, academia and companies.
YouTube said it took its role very seriously. “We remove content that incites violence, terminate accounts run by terrorist organisations and respond to legal requests to remove content that breaks Britain’s law.”