By the end of next year, the Online Safety Bill is set to become law.
Politicians and online critics are hoping this bill will change the way we consume content and interact online here in the UK, with Ofcom having the power to enforce its regulations.
So, what are these regulations? Who do they pertain to, and how will they affect your experience of using the internet?
Well, first and foremost, the bill is specifically targeted at tech firms. That includes major players like Facebook, Instagram, YouTube, Twitter and Snapchat, which allow users to interact and post their own content online.
Search engines, including Google, and commercial sites like OnlyFans are also covered by the bill.
What the bill does is require these companies to take specific steps to protect users from harmful content. If they fail to fulfil this 'duty of care', they will face substantial fines from Ofcom.
Based on this duty of care, here's what the Online Safety Bill is likely to mean for you.
While a total ban on anonymous social media posts and accounts is unlikely, some action will be taken to address the fact that 72% of online abuse comes from anonymous accounts.
Campaigners, including Clean Up The Internet, are calling on social media platforms to make users' verification status highly visible, and to provide features such as block buttons that allow users to pre-emptively block anonymous accounts.
To the dismay of many online campaigners, the bill contains no provision to prosecute big tech executives who fail to deal with algorithms that steer users towards harmful and radicalising content.
But there will be criminal sanctions for users and executives.
The bill currently contains provisions for a deferred power, taking effect after about two years, to impose criminal sanctions on executives who fail to respond accurately and promptly to information requests from Ofcom.
For users, three new criminal offences will be introduced:
1. Sending messages or posts that “convey a threat of serious harm”.
2. Posting misinformation – “false communications” – intended to cause non-trivial emotional, psychological or physical harm.
3. Sending posts or messages intended to cause harm without reasonable excuse.
The bill requires companies to submit details of how their services may expose users to harmful content, and what they have done to reduce that risk.
These risk assessments will inform codes of conduct for the platforms that Ofcom will enforce, with the threat of fines.
Despite this, many feel that the regulator needs stronger powers if it is to properly audit tech firms, such as the power to scrutinise algorithms and demand changes to them. So, we may see changes in this area.
A permanent committee of MPs and peers will be set up to scrutinise how the Secretary of State and Ofcom enforce the bill. The committee can recommend measures such as the Secretary of State using secondary powers under the bill to advise Ofcom on how it should exercise its powers.