
What does the Online Safety Bill include?

George Salmon

During yesterday's Queen's Speech, the government revealed a draft version of the Online Safety Bill, a landmark attempt to prevent harmful content from being shared online.

The bill is the most recent stage of the government's response to the Online Harms White Paper, first published in April 2019, which aimed to usher in a new age of accountability for tech companies, ensuring that this accountability is commensurate with the role they play in our daily lives.

What does the Bill include?

The key element of the bill concerns the responsibility social media platforms have for the content they host. To prevent the spread of harmful content online, the bill will introduce a duty of care for all social media sites, websites, apps and other services hosting user-generated content.

Under this new regime, all such sites will have to consider the ways their platforms can be used to disseminate abusive or harmful content and, if such content appears, take robust action to remove it.

Through a later addition to the bill, the definition of harmful content has been expanded. Alongside monitoring child sexual abuse, terrorist material and suicide content, the largest social media sites, referred to as Category 1 services, will also have to act on content that is lawful but still harmful. While ambiguous, this is designed to widen their responsibility, ensuring abuse that falls below the threshold of a criminal offence can still be tackled.

To enforce this new regime, Ofcom will be given the power to fine companies that fall below the required standard of care, with fines of up to £18 million or ten per cent of annual global turnover. The regulator will also have the power to block access to non-compliant sites.

The bill also empowers Ofcom to hold individuals responsible for a lack of compliance: the regulator is set to receive reserved powers to pursue criminal action against senior managers who fail to comply with the code and do not respond to Ofcom's requests for information.

The question of free speech

Compelling online platforms to police and remove user-generated content has been met with controversy, particularly among ardent defenders of free speech. Responding to this, the bill contains a section which aims to strengthen freedom of expression and prevent moderation from becoming overzealous.

To do this, affected companies will have to both regulate content and ensure safeguards are put in place for freedom of expression. This could include having human moderators make decisions on removals, and the bill also compels platforms to create effective routes of appeal for content removed without good reason. Category 1 services will shoulder the additional responsibility of publishing regular assessments of their impact on freedom of expression.

The bill also sets out categories of content it considers so important that they should be exempt from the wider regulations. Content deemed democratically important, such as content promoting or opposing government policy or campaigning on a live political issue, will have special protections. Alongside taking into account the political context of any content while moderating, platforms will also be forbidden from discriminating against particular political viewpoints.

The same is true for journalistic content, and the bill specifically exempts both the content of news publishers' websites and the comments underneath them – a potentially controversial inclusion, as the comment sections of major news sites are often sites of abuse and harassment.

Tackling online fraud

Abuse and graphic content are not the only types of post that can be harmful – scams and online fraud can be just as devastating. To cover this, tackling fraudulent user-generated content will also fall to online platforms, with the bill specifically highlighting the growing prevalence of romance fraud, whereby scammers feign romantic interest in their target in order to steal money or personal details.

What does this mean for the future of social media?

A backdrop of widespread social media boycotts following racist abuse, together with growing concerns over the spread of disinformation across social media platforms, has been the catalyst for this new regulation.

This is no new trend - it has been brewing for some time, with January's riots at the US Capitol the starkest indication of its effects. With social media becoming so ubiquitous, and crucially with millions around the globe now using it as their primary news source, regulation was inevitable.

More widely, it charts a definitive next step in the changing role of social media: moving from its original purpose as a publisher to its assumed role as an editor. While these reforms seem sweeping, how strictly they will actually be enforced is another question altogether.

As for the bill itself, the draft version will be scrutinised by the Digital, Culture, Media and Sport Select Committee, a joint committee of MPs, before it is later submitted to the House for a vote.  

Whether you are facing defamation in the press or by an individual, harassment, blackmail, mishandling of your private data or pre-publication threats, or you wish to obtain expert legal advice on such matters, reach out to our leading reputation and privacy team today.
