Online Safety Bill to become law in crackdown on harmful social media content
The Online Safety Bill has passed its last parliamentary hurdle in the House of Lords, meaning it will finally become law after years of delay.
The flagship piece of legislation aims to regulate online content to help keep users safe, especially children, and to put the onus on social media companies to protect people from the likes of abusive messages, bullying and pornography.
The idea was conceived in a 2019 white paper, but turning it into law has been a long and rocky road, marked by delays and controversies over issues such as freedom of speech and privacy.
Technology Secretary Michelle Donelan said: “The Online Safety Bill is a game-changing piece of legislation. Today, this government is taking an enormous step forward in our mission to make the UK the safest place in the world to be online.”
The bill will require social media companies to remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm.
Other illegal content the bill targets includes the sale of drugs and weapons, the incitement or planning of terrorism, sexual exploitation, hate speech, scams and revenge porn.
Communications regulator Ofcom will be largely responsible for enforcing the bill.
Social media bosses who fail to comply can face large fines or even jail under the crackdown.
The bill has also created new criminal offences, including cyber-flashing and the sharing of “deepfake” pornography.
There have been concerns within the Tory Party that it is simply too far-reaching, potentially to the point of threatening free speech online.
Meanwhile, tech companies criticised proposed rules for regulating legal but harmful content, arguing the rules would make them unfairly liable for material on their platforms.
That measure was removed from the bill in an amendment last year: instead of requiring platforms to take down legal but harmful content, the legislation now requires them to give adults tools to hide certain material they do not wish to see.
This includes content that does not meet the criminal threshold but could be harmful, such as the glorification of eating disorders, misogyny and some other forms of abuse.
However, the bill still tasks companies with protecting children not just from illegal content, but from any material which can “cause serious trauma”.