
The Online Safety Act is a trailblazer in child protection
Ofcom, the UK’s media regulator, recently announced landmark new rules under one of the world’s most ambitious regimes to protect children’s safety online. Central to these new rules are age verification requirements, which mean platforms must take steps to determine a user’s age before granting access to adult content.
Similar proposals are being reviewed or piloted across some states in the US, the European Union, and Australia. Many countries are watching the progress in the UK closely.
According to Ofcom, the verification process can be carried out using bank card details, email addresses, photo-ID, facial age estimation or mobile network checks.
Until now, the responsibility for creating and accessing content has largely been placed on users. There is no other marketplace in progressive society that operates like this: one where participants have full, unfettered access to controversial material without any consideration of suitability, or any duty of care, from the forum itself.
The Children’s Commissioner has produced a harrowing analysis of the impact this is having on children in the UK in a recent report. The most common platform on which children are first shown explicit material is X (formerly Twitter), illustrating just how far-reaching regulation needs to be.
Through the Online Safety Act, sites and apps will face hefty fines or even access restriction orders if they do not verify the age of users before allowing them to access adult content.
As the Department for Science, Innovation and Technology said: "children have been left to grow up in a lawless online world for too long" and "the Online Safety Act is changing that".
Critics have argued that the Act infringes on users’ free speech. The reality is that, had we known the internet would take the form it has today, access and participation would never have been permitted in such an unregulated fashion.
Insofar as the Online Safety Act limits freedom of speech, it does so to protect children’s right to be free from harm.
Since the age verification measures were announced, the UK has also seen an increase in the use of Virtual Private Networks (VPNs) to mask the location of the device being used. The implication is that users – adults or children – are attempting to bypass the verification requirements.
This is a clear loophole which needs amending. I would support the expansion of the scope of the Online Safety Act to include VPNs.
Over time, we can also expect other countries to take measures similar to those implemented under the Act in the UK, closing off some of the alternative locations through which users might route around verification.
Criticisms have also been made regarding the risk of data breaches at services which hold the personal information of those accessing adult content for age verification purposes.
Data storage and security is certainly a prominent policy area these days, which is why the government brought forward the Data (Use and Access) Bill last October, creating certification for digital verification technologies.
The storage of sensitive information online is not a new issue. Over 90% of British adults use online banking, giving them remote access to their finances. Just as discretion and security are of the utmost importance to both banks and customers, the same logic applies to digital verification providers, their corporate clients, and end consumers.
In the end, opposition to the Online Safety Act boils down to whether society suffers a greater risk through unregulated access to the internet for children or regulated access for adults. One glance at the Children’s Commissioner’s “Sex is kind of broken now”: children and pornography report makes the choice clear.
As the internet evolves, our protections for society – especially the vulnerable – need to be updated with it.

Connor Naismith is the Labour Member of Parliament for Crewe and Nantwich.



