London, April 9 (IANS) The British government’s “online harms” white paper, open for public consultation until July 1, aims to make online platforms legally responsible for protecting their users, especially children.
“The prevalence of the most serious illegal content and activity, which threatens our national security or the physical safety of children, is unacceptable,” said the joint proposal from the Department for Digital, Culture, Media and Sport (DCMS) and Home Office that came out on Monday.
“Online platforms can be a tool for abuse and bullying, and they can be used to undermine our democratic values and debate.
“The impact of harmful content and activity can be particularly damaging for children, and there are growing concerns about the potential impact on their mental health and well-being,” the proposals read.
“There are also examples of terrorists broadcasting attacks live on social media. Child sex offenders use the Internet to view and share child sexual abuse material, groom children online, and even live stream the sexual abuse of children,” said the white paper.
The white paper proposes imposing a mandatory “duty of care” on social media platforms, requiring them to take reasonable steps to protect their users from a range of harms.
“Social media platforms use algorithms which can lead to ‘echo chambers’ or ‘filter bubbles’, where a user is presented with only one type of content instead of seeing a range of voices and opinions.
“This can promote disinformation by ensuring that users do not see rebuttals or other sources that may disagree and can also mean that users perceive a story to be far more widely believed than it really is,” said the paper.
The paper also notes that rival criminal gangs use social media to promote gang culture and incite violence.
“This, alongside the illegal sale of weapons to young people online, is a contributing factor to senseless violence, such as knife crime, on British streets,” it further read.
The Internet can be used to harass, bully or intimidate, especially people in vulnerable groups or in public life, the paper says, and young adults or children may be exposed to harmful content relating, for example, to self-harm or suicide.
“These experiences can have serious psychological and emotional impact. There are also emerging challenges about designed addiction to some digital services and excessive screen time,” the British proposals noted.
The white paper sets out a programme of action to tackle content or activity that harms individual users, particularly children.
“The government will establish a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services,” it said.
Compliance with this duty of care will be overseen and enforced by an independent regulator.
In a statement, Facebook’s UK Public Policy Chief Rebecca Stimson said: “While we’ve tripled the team working to identify harmful content and protect people to 30,000 and invested heavily in technology to help prevent abuse of our platform, we know there is much more to do.
“New rules for the Internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech.”