30 October 2023
Online Safety Bill: a trade-off between privacy and safety?
As champions of the much-anticipated legislation celebrate it becoming law, critics hold their breath in anticipation of potential threats to privacy, security and innovation. Alexandra Leonards reports.
On Thursday, the government’s new “zero-tolerance” approach to the protection of children online finally came to fruition in the form of the Online Safety Bill, a “world-first” piece of legislation which has been in the works for over four years.
The Bill has been through multiple iterations and can be charted back as far as the Online Harms White Paper proposed by Theresa May’s government in April 2019. Even before that, the Conservative party wanted to force through age verification for viewing adult content online as part of the Digital Economy Bill in 2016 – an aspect of the original plan which has survived through to today.
Seven years on, Michelle Donelan, the Secretary of State for Science, Innovation and Technology, described the long-awaited Bill’s adoption into law as a historic moment that will ensure the safety of people online “now and for generations to come”. However, many stakeholders won’t be celebrating the move over ongoing – and largely unaddressed – concerns about its controversial rules on end-to-end encryption (E2EE) along with wider implications for online surveillance.
Fears about the legislation came to a head in the spring, when Meta-owned WhatsApp joined several messaging services to pen an open letter warning the government that these encryption rules were a threat to privacy. Now that the law is official, many critics are concerned that privacy and children’s online safety will be treated as mutually exclusive in how Brits use the internet going forward.
Under the new law, social media companies will be required to tackle child sexual abuse content regardless of the technologies they use, including services that use E2EE.
E2EE is a secure communication system which encrypts data at the sender's end and decrypts it at the recipient's end using a shared encryption key. As such, no third party, including the service provider, can read the message in transit.
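The principle can be sketched in a few lines of Python. This is a toy illustration only – it uses a one-time-pad XOR cipher for simplicity, whereas real E2EE systems such as the Signal protocol negotiate keys via Diffie-Hellman exchange and use authenticated ciphers – but it shows why a relaying server sees nothing it can read:

```python
import secrets

# Toy sketch of the E2EE idea: a key shared only by the two endpoints.
# Anyone relaying the ciphertext (the "service provider") sees only
# opaque bytes. One-time-pad XOR is used purely for demonstration.

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    assert len(key) >= len(plaintext), "one-time pad needs key >= message"
    return bytes(k ^ p for k, p in zip(key, plaintext))

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    return bytes(k ^ c for k, c in zip(key, ciphertext))

message = b"pickup at 3:30"
key = secrets.token_bytes(len(message))  # known only to sender and recipient

ciphertext = encrypt(key, message)       # this is all the server relays
assert decrypt(key, ciphertext) == message  # only a key holder can read it
```

Because decryption requires the key, and the key never leaves the two endpoints, the provider in the middle cannot inspect the content in transit – which is precisely the property the new law’s scanning provisions collide with.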
While many messaging services already use E2EE, some social media companies are only now planning to roll it out. Meta, for example, has said that it is on track to implement E2EE across Facebook and Instagram by the end of the year. WhatsApp, also owned by Meta, has used the practice since 2016, while other apps like Telegram, Signal and Proton Mail built E2EE in from the ground up as a foundational part of their offering.
From the government’s perspective, unchecked E2EE messaging services put children in the UK at risk of being targeted and groomed online. It has argued that E2EE overrides current controls in place to keep children safe, and that introducing encryption without the necessary safety features outlined in the Bill would “blind social media companies to the sexual abuse material that is repeatedly shared on their platforms”.
Now that the Online Safety Bill has come into law, Ofcom has the power to force social media firms or messaging platforms to use accredited technology – or develop their own technology – to tackle child abuse whether on public or private channels.
Companies that fail to comply with the rules could be fined up to £18 million or 10 per cent of their global annual revenue, whichever is greater. This means that Big Tech firms like Meta or TikTok could face penalties worth billions.
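As a back-of-the-envelope illustration of how the cap scales (the revenue figure below is a hypothetical placeholder, not a reported number), the penalty formula can be expressed as:

```python
def max_fine_gbp(global_annual_revenue_gbp: float) -> float:
    """Penalty cap described in the Act: the greater of a flat
    GBP 18m or 10 per cent of global annual revenue."""
    return max(18_000_000.0, 0.10 * global_annual_revenue_gbp)

# A platform turning over a hypothetical GBP 90bn a year would face a
# cap of GBP 9bn - hence "penalties worth billions" for the largest firms.
print(f"{max_fine_gbp(90e9):,.0f}")  # 9,000,000,000
```

For smaller firms, 10 per cent of revenue falls below the £18 million floor, so the flat figure applies instead.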
"When encryption is weakened, everyone’s security is threatened. This will have dire ramifications not only for people’s privacy but also for the economy."
He warns that the move could also stifle innovation in the UK, because businesses wary of “running afoul of the government” won’t invest in developing new products that use encryption.
Paul Holland, chief executive of email security software business Beyond Encryption, says that while the ambitions of the legislation are commendable, they present a major problem for encryption and private messaging services.
"Currently, there’s no way for such services to align with the proposed legislation without breaking their encryption and thereby putting the privacy of users at risk."
Will Richmond-Coggan, partner at national law firm Freeths and specialist in data and online privacy disputes, argues that the harms the Online Safety Bill is looking to prevent, while serious, are “often nebulously expressed”.
“This makes it difficult to weigh the extent of the intrusion into privacy that ought to be contemplated, in seeking to combat those harms,” he explains, adding that measures which dilute protections around E2EE will be hard to adopt in a manner that appropriately safeguards privacy.
He believes that there is still more to be done to make the case for why such measures are necessary and justify the level of intrusion involved.
"Without that case being convincingly made, it will be hard to persuade the technology providers to get on board – rather than, say, deciding no longer to make their products available in the jurisdiction – or indeed to persuade users to adopt compromised technology rather than seeking to circumvent the restrictions."
It is worth noting that the government is not explicitly asking companies to stop implementing E2EE across their messaging services; instead, it wants them to roll out safety measures sufficient to maintain or improve the identification and prevention of child sexual abuse online.
A trade-off between privacy and safety?
The Internet Society’s Callum Voge posits that the government has presented the issue as a “false trade-off” between privacy and safety, with the premise being that the UK public should agree to forfeit their privacy in exchange for the safety of children.
“The problem is that privacy and security do not stand at opposite ends of the spectrum but instead are very much interlinked,” warns the director of governmental affairs.
He says that the benefits of encryption – ensuring that sensitive information is kept confidential and cannot be tampered with, and letting users of messaging platforms know they are really communicating with the person they think they are – matter to children just as much as to everyone else.
"Simply put, encryption keeps children safe. When a child shares their location pin to be picked up after school, when first-time parents take photos of their toddler, or when LGBTQ+ teenagers come out – encryption empowers all of us to confidently live our lives with the knowledge that we are using a tool that increases both our privacy and security."
Tasha Gibson, product manager for online safety at education technology company RM Technology, says that finding a balance between privacy and safety isn't a one-size-fits-all endeavour.
"The safety of children online is paramount, and this has to be taken into consideration when talking about privacy. Both privacy and safety are fundamental rights and values, so prioritising one over the other completely could lead to unintended consequences."
A balance of rights
Online child protection charity the Internet Watch Foundation (IWF) argues that when reaching a judgement, Ofcom will need to consider a balance of rights, including the right to privacy.
“The right to privacy is not an absolute right and must be balanced in line with other rights,” says Michael Tunks, head of policy and public affairs at the IWF, which this week published a shocking report warning that thousands of AI-generated child sex abuse images are threatening to “overwhelm” the internet.
The head of policy says that the organisation would urge tech companies to think about the children in child sexual abuse images and their right to privacy when taking decisions about whether to encrypt their platforms.
“We believe that there are solutions to keep children safe online in a privacy-preserving manner, that so far have not been explored,” he continues. “We do not believe that the right to privacy and children’s rights to safety and security are an either/or; it must be, and can be, possible to have both.
"We already see companies deploy technology to prevent the spread of malware, phishing and spyware or even to target advertising, so why can we not prevent the circulation of child sexual abuse material in these environments?"
Rani Govender, senior child safety online policy officer at the NSPCC, agrees that the privacy and safety rights of children who have been or are being sexually abused have been “completely lost” in the increasingly polarised debate about the Bill.
"Tech companies should look to deliver a balanced settlement for their users – ensuring children are protected from sexual abuse, and the privacy of all users is maintained. This includes the privacy of children who are victims of sexual abuse whose images are shared online and without detection in end-to-end encrypted environments."
She argues that there are important safeguards in place to ensure that Ofcom’s powers for private messaging are used proportionately and with regard to user rights, including requirements to consider user privacy and freedom of expression.
The NSPCC, citing its own polling, found that the UK public overwhelmingly support measures to tackle child abuse in E2EE environments, with 73 per cent of the 1,700 adults surveyed backing the measure that gives Ofcom powers to require tech giants to use technology to identify child sex abuse in these private environments if a significant risk to children has been identified.
Ofcom is already beginning work on tackling illegal content and plans to launch a consultation process early next month. Once the consultation has been completed, there will be a phased approach to bringing the law into full force, with the majority of the Act’s rules set to commence in as little as two months.
As the owner of four of the world’s largest social media and messaging platforms looks to roll out encryption for millions of UK users on Facebook and Instagram by the end of 2023, all eyes will be on the country’s latest legislation – and on Meta, to see whether it plays ball.