There's a major loophole in the Online Safety Bill

Ukrainian President Volodymyr Zelenskyy recently appeared in a video calling on Ukrainian citizens to lay down their arms and welcome the friendly Russians who were trying to liberate their country. 

Except he didn’t, of course: it was a Russian-sponsored deepfake disinformation campaign to undermine the Ukrainian defence of their homes and democracy.

There are lots of other examples, from tragic and hurtful interference in the recent disappearance of Nicola Bulley, to anti-vax campaigners, right up to attempts to influence democratic elections.

The Government’s new Integrated Review of our security, defence and foreign policy included £20 million in funding to the BBC World Service to combat disinformation in countries targeted by hostile states. And while telling lies is as old as speech itself, social media and online memes have created a whole new frontier of opportunities for cranks and despots to spread nonsense. 

The old saying that a lie can be halfway around the world before the truth has got its boots on has never been more true.

The influence of online ‘news’ has grown fast. Facebook and Instagram are the most important news sources for 16- to 24-year-olds, so it really matters if what they’re being told isn’t true. 

Research by Polis Analysis shows 88% of us want the Government to clean up online misinformation and disinformation.

So the Online Safety Bill ought to be extremely welcome, as it aims to do precisely that. But amongst all the vital and sensible changes which it aims to introduce to protect children from online harm, and prevent online scams and fraud, there’s nothing about stopping misinformation at all. It’s a big and important loophole, and there are two possible ways to plug it.

The first is to make sure that online facts are accurate. 

We need the giant social media firms to show whether the content they’re serving us is a deepfake, so we’ve got the tools to tell if someone is trying to spin us a yarn. And we ought to be able to set our online filters to exclude material that’s been doctored, if we don’t want to be shown lies in the first place.

It would be quicker and more reliable than trying to get the social media giants to take down factually inaccurate posts after they’ve already gone viral, and it would boost reputable journalists and news reporters who’ve taken the trouble to fact check their stories properly before they get posted online.

The second is to expect the social media giants to behave like radio or TV broadcasters, whose news programmes have to be balanced to show both sides of an argument. 

It’s a long-established standard that has worked pretty well for decades, ensuring (mostly) fair play, free speech and journalistic integrity. Asking the online social media giants to digitise it for each user’s personal filter bubble would make online radicalisation less likely, and echo-chambers less toxic too.

And that’s it. These two changes ought to clean up the online world, so what we read and see online is just as good (or bad) as in traditional books and media. 

The Government has acknowledged that mis- and disinformation pose a serious threat to our safety, both personal and geopolitical, offline and online. Now is the time for action.

John Penrose MP writes alongside Thomas Barton, the Founder and CEO of Polis Analysis.

John Penrose is the former Conservative MP for Weston-super-Mare.
