Summary
This article outlines the Online Safety Act (OSA) 2023 and its implications for transgender individuals, including measures to address illegal content and hate speech online.
Online Safety Act (OSA) 2023 – What it means for Trans People
Introduction
The Online Safety Act 2023 became law on 26 October 2023 and was needed “because of the changing landscape of online harms and the necessity to protect individuals in the UK by making the use of internet services safer”. The OSA addresses the risks of harm from illegal content and activity, and it imposes duties on providers to identify, mitigate, and manage those risks.
In this article I analyse the aspects of the OSA that most affect trans people. I quote from the various OSA documents, and if you wish to dig deeper yourself, just follow this link to all the relevant documents on the website of Ofcom, the independent regulator of online services in the UK. But be warned: there are over a thousand pages of legal and regulatory guidance, so not a light bedtime read!
Analysis
From October 2023 onwards, Ofcom was given 18 months by the Government to consult with online providers and then produce the regulations and procedures needed for the legal requirements of the OSA to be correctly implemented and enforced. Ofcom’s duties are extensive and include:
- Creating regulations: Ofcom is responsible for establishing new regulations to protect people from illegal harm online. It has produced Codes of Practice for providers of online services.
- Risk assessment: Ofcom must carry out risk assessments to identify and assess the risks of harm that illegal content poses to individuals in the UK. The findings of these risk assessments are published in a ‘Register of Risks’. In turn, service providers must produce their own risk assessments for their respective online products (user-to-user services, search services, recommender services, etc.).
- Guidance: Ofcom has issued “Illegal Content Judgements Guidance” to assist service providers in complying with their duties.
- Enforcement: Ofcom has the power to take enforcement action against service providers that fail to comply with the Act. They can conduct investigations, issue notices, and impose penalties for non-compliance, up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
- Monitoring and Assessment: Ofcom is required to monitor service providers’ risk assessments to identify and evaluate potential dangers to individuals in the UK.
On 16 December 2024 Ofcom released its first Illegal Harms Statement, enforcement guidelines, and draft Codes of Practice. These initiatives are designed to equip online service providers with the understanding and tools needed to comply with the new regulatory landscape. Ofcom then gave all service providers three months, until 16 March 2025, to complete their first illegal content risk assessments.
From 17 March 2025, providers must implement the safety measures outlined in the Codes of Practice, or use other equally effective measures, to protect users from illegal content and activity. Ofcom has stated it will take enforcement action if providers do not act promptly to address the risks on their services.
What Are Illegal Harms?
Illegal harms are “those arising from both illegal content and the commission and facilitation of priority offences”. Ofcom lists 130 “illegal harms” which are the subject of current UK laws, and these have been grouped into 17 broad categories of “priority offences”:
- Terrorism
- Child Sexual Exploitation and Abuse (CSEA)
- Grooming
- Child Sexual Abuse Material (CSAM)
- Hate Offences
- Harassment, Stalking, Threats, and Abuse
- Controlling or Coercive Behaviour
- Intimate Image Abuse
- Extreme Pornography
- Sexual Exploitation of Adults
- Human Trafficking
- Unlawful Immigration
- Fraud and Financial Offences
- Proceeds of Crime
- Drugs and Psychoactive Substances
- Firearms, Knives, and Other Weapons
- Encouraging or Assisting Suicide
Ofcom lists all the current laws that relate to the “priority offences”; for example, the Serious Crime Act 2015, Misuse of Drugs Act 1971, Financial Services Act 2012, Fraud Act 2006, Public Order Act 1986, Terrorism Act 2000, Terrorism Act 2006, Suicide Act 1961, Firearms Act 1968, Restriction of Offensive Weapons Act 1959, Knives Act 1997, and so on.
The Impact of Hate Language on Trans People
The nature of the internet makes it a haven for purveyors of hate, who can hide behind their computers to produce hateful language and content, including lies and misinformation, and spread their message around the world very quickly.
In UK law there are five particular groups in society recognised as more susceptible to hate speech and hate campaigns. The Crown Prosecution Service (CPS) [1] identifies these by the characteristics of race, religion, sexual orientation, disability and, of course, transgender identity.
The priority offence that most impacts trans people is “Hate Offences”, and the CPS definition of a hate offence is “Any criminal offence which is perceived by the victim or any other person, to be motivated by hostility or prejudice, based on a person’s disability or perceived disability; race or perceived race; or religion or perceived religion; or sexual orientation or perceived sexual orientation or transgender identity or perceived transgender identity.”
Hate language can have a severe impact on the individual or group being targeted. The psychological effects include shock, anger, suicidal thoughts, shame, exhaustion, and fear, which can in turn lead to behavioural changes. Many trans people will have experienced these effects.
Illegal Content Judgements Guidance (ICJG)
To understand in practical detail whether a trans person can complain to a service provider about the content of an online message, the document to read is the “Illegal Content Judgements Guidance (ICJG)” [2], pages 46–71. It starts by summarising the “themes” of illegal content, then discusses them in detail. The themes are:
- Threats (including hate), encompassing:
a. threatening behaviour which is likely to cause fear or alarm
b. threatening behaviour which is likely to cause harassment or distress
c. threats which are likely to stir up racial hatred
d. threats which are likely to stir up hatred on the basis of religion or sexual orientation
e. threats which may provoke violence
- Abuse and insults (including hate), encompassing:
a. abusive behaviour which is likely to cause fear or alarm
b. abusive behaviour which is likely to cause harassment or distress
c. abuse which is likely to stir up racial hatred
d. abuse which may provoke violence
- Other content likely to amount to harassment (including stalking and controlling or coercive behaviour).
The key words above are harassment, alarm and distress, because these terms are specifically identified in law, in sections 4A and 5 of the Public Order Act 1986 [3].
However, UK case law has set a very high bar for determining whether a social media communication has caused a person to be harassed, alarmed or distressed. For example, a death threat received online would not be considered a serious or viable threat unless it is clear the perpetrator knew where the victim lived or where to find them; see the CPS guidance [4].
False Communications
Under section 179 of the Online Safety Act 2023 [5], a “false communications offence” is committed when a person knowingly sends a message containing information they know to be false, with the intention of causing non-trivial psychological or physical harm to a likely audience, and without a reasonable excuse for sending it. This essentially makes it a criminal offence to deliberately spread false information online that could cause significant harm to others.
Key points about the offence:
- Intention to cause harm: The sender must intend for the message to cause non-trivial psychological or physical harm to the likely audience.
- Knowledge of falsity: The person sending the message must know that the information it contains is false.
- No reasonable excuse: There must be no justifiable reason for sending the message.
- Likely audience: The law considers who is likely to encounter the message when determining if harm could be caused.
This is really important for trans people because arguably the worst type of online harm is caused by misinformation. For example, the phrase “gender identity ideology” is used to promote the idea that being trans is just a political ideology, when in reality trans people suffer from a diagnosable neuro-divergent condition called gender dysphoria (gender incongruence); it is no more an ideology than “autism ideology” or “dyslexia ideology”.
Complaints Process
The OSA stipulates that online service providers must have a formal complaints process in place which handles complaints efficiently and, if a complaint is found to be valid, removes the content very quickly.
So, in the first instance, you should complain to the service provider. If you are not satisfied with their response, they should have an appeals process, which you should then use.
If you remain dissatisfied with the outcome, you may file a complaint with Ofcom through their online complaints form:
https://ofcomlive.my.salesforce-sites.com/formentry/OSComplaintsSafetyAndComplaints
Ofcom cannot investigate your individual complaint, but it does provide a list of support services that can help you, including the police hate crime reporting tool, True Vision.
Instead, Ofcom will combine your complaint with other similar complaints to gather evidence that an online service provider’s processes do not satisfy the OSA and Ofcom guidelines; it can then take general enforcement action to ensure the provider changes its processes.
Translucent Database
In order to increase the likelihood of Ofcom investigating a particular online service in relation to transphobic content, it is important to capture as many individual cases as possible. Translucent is compiling a hate crime complaints database, which it will use in dialogue with Ofcom, to highlight which online providers are in breach of the OSA and Ofcom guidelines, and to assist Ofcom in taking appropriate enforcement action.
If you have a complaint that has not been dealt with to your satisfaction and wish it to be added to the Translucent database, please forward the details to hate@translucent.org.uk.
References
1. CPS hate crime information: www.cps.gov.uk/crime-info/hate-crime
2. Ofcom Illegal Content Judgements Guidance (ICJG): www.ofcom.org.uk/siteassets/resources/documents/online-safety/information-for-industry/illegal-harms/illegal-content-judgements-guidance-icjg.pdf?v=387556
3. Public Order Act 1986, section 4A: https://www.legislation.gov.uk/ukpga/1986/64/section/4A
4. CPS legal guidance on communications offences: https://www.cps.gov.uk/legal-guidance/communications-offences
5. Online Safety Act 2023, section 179: https://www.legislation.gov.uk/ukpga/2023/50/section/179