Online Safety Act 2023

From Wikipedia, the free encyclopedia

Online Safety Act 2023
Act of Parliament
Long title: An Act to make provision for and in connection with the regulation by OFCOM of certain internet services; for and in connection with communications offences; and for connected purposes.
Citation: 2023 c. 50
Introduced by: Michelle Donelan, Secretary of State for Science, Innovation and Technology (Commons); The Lord Parkinson of Whitley Bay, Parliamentary Under-Secretary of State for Arts and Heritage (Lords)
Territorial extent:
  • England and Wales
  • Scotland
  • Northern Ireland
Dates
Royal assent: 26 October 2023
Commencement: On royal assent and by regulations
Status: Current legislation
History of passage through Parliament
Text of statute as originally enacted
Text of the Online Safety Act 2023 as in force today (including any amendments) within the United Kingdom, from legislation.gov.uk.

The Online Safety Act 2023[1][2][3] (c. 50) is an act of the Parliament of the United Kingdom regulating online speech and media. It received royal assent on 26 October 2023 and gives the relevant Secretary of State the power, subject to parliamentary approval, to designate and to suppress or record a wide range of speech and media deemed "harmful".[4][5]

The act requires platforms, including end-to-end encrypted messengers, to scan for child pornography, despite warnings from experts that such a scanning mechanism cannot be implemented without undermining users' privacy.[6]

The act creates a new duty of care for online platforms, requiring them to take action against illegal, or legal but "harmful", content from their users. Platforms failing this duty are liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher. The act also empowers Ofcom to block access to particular websites, and it obliges large social media platforms not to remove, and to preserve access to, journalistic or "democratically important" content such as user comments on political parties and issues.

The bill that became the Act was criticised for its proposals to restrain the publication of "lawful but harmful" speech, effectively creating a new form of censorship of otherwise legal speech.[7][8][9] As a result, in November 2022, measures that were intended to force big technology platforms to take down "legal but harmful" material were removed from the bill. Instead, tech platforms are obliged to introduce systems that allow users to better filter out the "harmful" content they do not want to see.[10][11]

The act grants significant powers to the Secretary of State to direct Ofcom, the communications regulator, in the exercise of its functions, including the power to direct Ofcom on the content of its codes of practice. This has raised concerns that government intrusion into the regulation of speech, through broad emergency-style powers, could undermine Ofcom's authority and independence.

Provisions

Scope

Within the scope of the act is any "user-to-user service". This is defined as an Internet service by means of which content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service, may be read, viewed, heard or otherwise experienced ("encountered") by another user, or other users. Content includes written material or messages, oral communications, photographs, videos, visual images, music and data of any description.[12]

The duty of care applies globally: to services with a significant number of United Kingdom users, to services that target UK users, and to services capable of being used in the UK where there are reasonable grounds to believe there is a material risk of significant harm.[12]

Duties

The duty of care comprises a number of specific duties that apply to all services within scope:[12]

  • The illegal content risk assessment duty  
  • The illegal content duties
  • The duty about rights to freedom of expression and privacy
  • The duties about reporting and redress
  • The record-keeping and review duties

For services 'likely to be accessed by children', adopting the same scope as the Age Appropriate Design Code, two additional duties are imposed:[12]

  • The children's risk assessment duties
  • The duties to protect children’s online safety

For category 1 services, which will be defined in secondary legislation but are limited to the largest global platforms, there are four further new duties:[12]

  • The adults' risk assessment duties
  • The duties to protect adults’ online safety
  • The duties to protect content of democratic importance
  • The duties to protect journalistic content

Enforcement

The act empowers Ofcom, the national communications regulator, to block access to particular user-to-user services or search engines from the United Kingdom,[13][14][15] including through interventions by internet access providers and app stores. The regulator can also impose, through "service restriction orders", requirements on ancillary services which facilitate the provision of regulated services. Section 92 of the Act lists as examples (i) services which enable funds to be transferred, (ii) search engines which generate search results displaying or promoting content, and (iii) services which facilitate the display of advertising on a regulated service (for example, an ad server or an ad network). Ofcom must apply to a court for both access restriction orders and service restriction orders.[12]

Section 44 of the Act also gives the Secretary of State the power to direct Ofcom to modify a draft code of practice for online safety if deemed necessary for reasons of public policy, national security or public safety. Ofcom must comply with the direction and submit a revised draft to the Secretary of State, who may give further directions to modify the draft and, once satisfied, must lay the modified draft before Parliament. The Secretary of State can also remove or obscure information before laying the review statement before Parliament.[16]

Limitations

The act imposes legal requirements ensuring that content moderation does not arbitrarily remove, or restrict access to, what it defines as journalistic content.[13] Large social networks are required to protect "democratically important" content, such as user-submitted posts supporting or opposing particular political parties or policies.[17] The government stated that news publishers' own websites, as well as reader comments on such websites, are not within the intended scope of the law.[13][15]

Age verification for online pornography

Section 212 of the act repeals part 3 of the Digital Economy Act 2017, which mandated age verification for access to online pornography but was subsequently not enforced by the government.[18] The act brings within scope any pornographic site with user-to-user functionality; under the draft published by the government, sites which do not have this functionality, or which choose to remove it, would not be in scope.[12]

Addressing the House of Commons DCMS Select Committee, the then Secretary of State, Oliver Dowden, confirmed he would be happy to consider a proposal, during pre-legislative scrutiny of the bill by a joint committee of both Houses of Parliament, to extend its scope to all commercial pornographic websites.[19] According to the government, the Act addresses the major concern expressed by campaigners such as the Open Rights Group[20] about the risk to user privacy posed by the Digital Economy Act's[21] age verification requirement, by creating, for services within scope of the legislation, "A duty to have regard to the importance of... protecting users from unwarranted infringements of privacy, when deciding on, and implementing, safety policies and procedures."[12]

In February 2022 the Digital Economy Minister, Chris Philp, announced that the Act would be amended to bring commercial pornographic websites within its scope.[22]

Legislative process and timetable

The draft bill for the act was given pre-legislative scrutiny by a joint committee of members of the House of Commons and peers from the House of Lords. The Opposition spokesperson in the House of Lords, Lord Ponsonby of Shulbrede, said: "My understanding is that we now have a timeline for the online harms Bill, with pre-legislative scrutiny expected immediately after the Queen's Speech—before the Summer Recess—and that Second Reading would be expected after the Summer Recess."[23] But the Minister replying declined to pre-empt the Queen's Speech by confirming this.

In early February 2022, ministers planned to add to their existing proposal several criminal offences covering those who send death threats online or deliberately share dangerous disinformation, such as fake cures for COVID-19. Other new offences, such as revenge porn, posts advertising people-smuggling, and messages encouraging suicide, would fall under the responsibility of online platforms like Facebook and Twitter to tackle.[24]

In September 2023, during the third reading in the Lords, Lord Parkinson of Whitley Bay presented a ministerial statement from the government claiming that the controversial powers allowing Ofcom to break end-to-end encryption would not be used immediately.[6] Despite this claim, the provisions allowing end-to-end encryption to be weakened were not removed from the Act, and Ofcom can at any time issue notices requiring the breaking of end-to-end encryption technology. The statement followed suggestions from several tech firms, including Signal, that they would withdraw from the UK market rather than weaken their encryption.

Support

The UK National Crime Agency, part of the Home Office, has said the act is necessary to protect children.[25] The NSPCC has been a prominent supporter of the Act, saying it will help protect children from abuse.[26] The Samaritans, which had made strengthening the Act one of its key campaigns "to ensure no one is left unprotected from harmful content under the new law",[27] gave the final Act its qualified support, while saying it fell short of the promise to make the UK the safest place to be online.[28]

Opposition

The international human rights organization Article 19 stated that they saw the Online Safety Act 2023 as a potential threat to human rights, describing it as an "extremely complex and incoherent piece of legislation".[29] The Open Rights Group described the Act as a "censor's charter".[30]

During an interview with the BBC, Rebecca MacKinnon, the vice president for global advocacy at the Wikimedia Foundation, criticised the Act, saying the threat of "harsh" new criminal penalties for tech bosses would affect "not only big corporations, but also public interest websites, such as Wikipedia".[31] In the same interview, MacKinnon argued the Act should have been modelled on the European Union's Digital Services Act, which she said distinguishes between centralised content moderation and community-based moderation.[31] In April 2023, MacKinnon and the chief executive of Wikimedia UK, Lucy Crompton-Reid, announced that the WMF did not intend to apply the Act's age-check requirements to Wikipedia users, stating that doing so would violate its commitment to collect minimal data about readers and contributors.[32][33] On 29 June 2023, WMUK and the WMF published an open letter asking the government and Parliament to exempt "public interest projects", including Wikipedia itself, from the Act before it entered its report stage on 6 July.[34][35]

In an official statement, Apple Inc. criticised legal powers in the Act which threatened end-to-end encryption on messaging platforms, describing the act as "a serious threat" to end-to-end encryption and urging the UK government to "amend the Bill to protect strong end-to-end encryption".[36][37]

Meta Platforms has criticised the plan, saying, "We don't think people want us reading their private messages ... The overwhelming majority of Brits already rely on apps that use encryption to keep them safe from hackers, fraudsters and criminals".[25] Head of WhatsApp Will Cathcart voiced his opposition to the Act, stating that the service would not compromise its encryption for the proposed law and saying "The reality is, our users all around the world want security – ninety-eight percent of our users are outside the UK, they do not want us to lower the security of the product and just as a straightforward matter, it would be an odd choice for us to choose to lower the security of the product in a way that would affect those ninety-eight percent of users."[38][39] He also stated in a tweet that scanning everyone's messages would destroy privacy.[40]

Ciaran Martin, a former head of the UK National Cyber Security Centre, accused the government of "magical thinking" and said that scanning for child abuse content would necessarily require weakening the privacy of encrypted messages.[25]

In February 2024, the European Court of Human Rights ruled, in an unrelated case, that requiring degraded end-to-end encryption "cannot be regarded as necessary in a democratic society".[41]

References

  1. ^ "Online Safety Act 2023".
  2. ^ Landi, Martyn (26 October 2023). "Online Safety Act becomes law in the UK". The Independent. Retrieved 27 October 2023.
  3. ^ "Online Safety Act 2023". UK Parliament. 27 October 2023. Retrieved 27 October 2023.
  4. ^ "Online Safety Bill: Beefed up internet rules become law". BBC News. 26 October 2023. Retrieved 26 October 2023.
  5. ^ Porter, Jon (26 October 2023). "The UK's controversial Online Safety Bill finally becomes law". The Verge. Retrieved 26 October 2023.
  6. ^ a b "Ministerial statement on UK's Online Safety Bill seen as steering out of encryption clash". uk.style.yahoo.com. 6 September 2023. Retrieved 6 September 2023.
  7. ^ "Tech firms could face fines over harmful content in government's new online safety bill". Sky News. Retrieved 18 May 2021.
  8. ^ "Online safety bill 'a recipe for censorship', say campaigners". The Guardian. 12 May 2021. Retrieved 18 May 2021.
  9. ^ Landi, Martyn (13 May 2021). "Online Safety Bill labelled 'state-backed censorship' by campaigners". www.standard.co.uk. Retrieved 18 May 2021.
  10. ^ "Online Safety Bill: Plan to make big tech remove harmful content axed". BBC News. 28 November 2022. Retrieved 29 November 2022.
  11. ^ Sandle, Paul (29 November 2022). "UK ditches ban on 'legal but harmful' online content in favour of free speech". Reuters. Retrieved 29 November 2022.
  12. ^ a b c d e f g h "Draft Online Safety Bill" (PDF). 12 May 2021. Retrieved 15 May 2021.
  13. ^ a b c Lomas, Natasha (12 May 2021). "UK publishes draft Online Safety Bill". TechCrunch. Retrieved 12 May 2021.
  14. ^ "Tech firms could face fines over harmful content in government's new online safety bill". Sky News. 12 May 2021. Retrieved 12 May 2021.
  15. ^ a b Wakefield, Jane (12 May 2021). "Government lays out plans to protect users online". BBC News. Retrieved 12 May 2021.
  16. ^ "Online Safety Bill (as brought from the Commons)". 18 January 2023. Archived from the original on 25 February 2023.
  17. ^ Hern, Alex (12 May 2021). "Online safety bill 'a recipe for censorship', say campaigners". The Guardian. Retrieved 12 May 2021.
  18. ^ Grant, Harriet (5 May 2021). "UK government faces action over lack of age checks on adult sites". The Guardian. Retrieved 13 May 2021.
  19. ^ "Oral evidence transcripts". UK Parliament: Committees.
  20. ^ "Digital Economy Bill Could Lead to Ashley Madison Style Data Breaches". Open Rights Group. 13 September 2016. Retrieved 15 May 2021.
  21. ^ "Digital Economy Act 2017 Part 3". gov.uk. 12 May 2021. Retrieved 15 May 2021.
  22. ^ Milmo, Dan; Waterson, Jim (8 February 2022). "Porn sites in UK will have to check ages in planned update to online safety bill". The Guardian.
  23. ^ "Domestic Abuse Bill - Wednesday 17 March 2021 - Hansard - UK Parliament". hansard.parliament.uk. Retrieved 15 May 2021.
  24. ^ Knowles, Tom; Dathan, Matt (5 February 2022). "Trolls could be jailed for online threats". The Times.
  25. ^ a b c "Braverman and Facebook clash over private message plans". BBC News. 19 September 2023. Retrieved 20 September 2023.
  26. ^ "The Online Safety Bill has been passed in a momentous day for children". NSPCC. 19 September 2023. https://www.nspcc.org.uk/about-us/news-opinion/2023/2023-09-19-the-online-safety-bill-has-been-passed-in-a-momentous-day-for-children/
  27. ^ "Online Safety Bill: the ins and outs". Samaritans. https://www.samaritans.org/news/online-safety-bill-the-ins-and-outs/
  28. ^ "Samaritans' response to the passing of the Online Safety Bill". Samaritans. https://www.samaritans.org/news/samaritans-response-to-the-passing-online-safety-bill/
  29. ^ "UK: Online Safety Bill is a serious threat to human rights online". ARTICLE 19. 25 April 2022. Retrieved 1 May 2023.
  30. ^ "Online Safety Bill a threat to human rights warn campaigners". Open Rights Group. 16 November 2022. Retrieved 1 May 2023.
  31. ^ a b "Wikipedia criticises 'harsh' new Online Safety Bill plans". BBC News. 17 January 2023. Retrieved 3 July 2023.
  32. ^ "Wikipedia will not perform Online Safety Bill age checks". BBC News. 28 April 2023. Retrieved 1 May 2023.
  33. ^ Milmo, Dan (28 April 2023). "UK readers may lose access to Wikipedia amid online safety bill requirements". The Guardian. ISSN 0261-3077. Retrieved 3 July 2023.
  34. ^ Iles, Natasha (29 June 2023). "Open call by UK civil society to exempt public interest projects from the Online Safety Bill". Wikimedia UK. Retrieved 3 July 2023.
  35. ^ Black, Damien (30 June 2023). "Wikimedia launches petition against UK Online Safety Bill". Cybernews. Retrieved 3 July 2023.
  36. ^ Messenger, Alexander (28 June 2023). "Apple calls UK's Online Safety Bill a "serious threat" to end-to-end encryption". 24zero. Retrieved 29 June 2023.
  37. ^ Ryan-Mosley, Tate (16 October 2023). "The fight over the future of encryption, explained". MIT Technology Review. Retrieved 22 October 2023.
  38. ^ "WhatsApp: Rather be blocked in UK than weaken security". BBC News. 9 March 2023. Retrieved 3 July 2023.
  39. ^ Messenger, Alexander (10 March 2023). "WhatsApp would leave UK, rather than compromise encryption". 24zero. Retrieved 29 June 2023.
  40. ^ Woollacott, Emma. "U.K. Passes Online Safety Bill Restricting Social Media Content". Forbes. Retrieved 27 October 2023.
  41. ^ Claburn, Thomas (15 February 2024). "European Court of Human Rights declares backdoored encryption is illegal". The Register. Retrieved 18 February 2024.
