Navigating the UK Online Safety Act
What Fediverse Service Providers Need to Know
The Online Safety Act (OSA) is now in force in the United Kingdom. Enforced by Ofcom, the UK’s communications regulator, the Act introduces a new set of legal duties for online services. Its stated purpose is to reduce online harm, particularly exposure to illegal content and material harmful to children, while safeguarding users’ rights to express themselves freely and access legal content.
If you operate a decentralised platform, the OSA may feel daunting. But for most community-run services, especially those that do not serve children and take steps to prohibit illegal material, compliance is both achievable and proportionate.
This post offers clarity, context, and practical advice for Fediverse administrators aiming to meet their obligations without sacrificing their independence or community values. It is aimed at small, low-risk services.
Disclaimer: This article is intended for informational purposes only. It is not legal advice. Platform operators are encouraged to seek independent legal counsel if they are unsure about their obligations under UK law.
What is the Online Safety Act?
The OSA imposes legal duties on online platforms that are accessible in the UK. These include:
- Preventing the distribution and amplification of illegal content (for example, terrorist material, child sexual abuse content, incitement to violence or hate),
- Protecting children from content that is harmful, even if not illegal, and
- Publishing terms of service and offering user redress for reported harms.
Ofcom, as regulator, is required by law to act proportionately. According to the UK Government’s official response to a public petition calling for the Act to be repealed:
“Proportionality is a core principle of the Act and is in-built into its duties. As regulator for the online safety regime, Ofcom must consider the size and risk level of different types and kinds of services when recommending steps providers can take to comply with requirements.”
“Once providers have carried out their duties to conduct risk assessments, they must protect the users of their service from the identified risks of harm. Ofcom’s illegal content Codes of Practice set out recommended measures to help providers comply with these obligations, measures that are tailored in relation to both size and risk. If a provider’s risk assessment accurately determines that the risks faced by users are low across all harms, Ofcom’s Codes specify that they only need some basic measures…”
These basic measures include:
- Easy-to-find, understandable terms and conditions,
- A complaints tool backed by a process for responding to reports of illegal content,
- The ability to remove content that is illegal or violates your rules,
- A named individual responsible for compliance who can be contacted by Ofcom if needed.
(Source: UK Government Petition Response, July 2025)
Why Regulation (Even Imperfect Regulation) Matters
The OSA was developed in response to real-world harms, including grooming, abuse, extremist content, and hate speech. These forms of harm disproportionately affect marginalised and vulnerable users. The law is far from perfect. Regulation is often blunt, shaped with large platforms in mind, and implemented unevenly. But it is still an attempt to make digital spaces safer and more accountable.
The individuals working at Ofcom are, for the most part, thoughtful and well-intentioned people working to create a safer internet. They did not write the law, but they are empowered to enforce it.
Jaz attended TrustCon 2025, where Ofcom, along with many other national internet safety regulators, had a strong presence. He met with representatives, attended sessions, and had informal conversations about how the regulator views decentralised services.
Ofcom understands that independent platforms serve community needs and do not necessarily pose the same systemic risks as profit-driven networks. Their message was consistent: low-risk, volunteer-run services are not the focus of enforcement. If you are operating in good faith, acting proportionately, and keeping your community safe, you are already on the right path.
Practical Compliance Steps for Fediverse Operators
If you run a small to medium service and neither host illegal content nor serve children, here is a straightforward path to compliance:
1. Complete an Illegal Harms Risk Assessment
This is the cornerstone of your legal duty. You must assess your platform for risks related to illegal harms. For most Fediverse instances, this risk is low, and that is acceptable, so long as you document that conclusion.
You are not required to publish your risk assessment, but doing so can demonstrate transparency and good faith. You can adapt these published examples:
- Neil Brown (a UK-based technology lawyer and self-hosting Fediverse provider) has shared six assessments he has completed.
- toot.wales, a UK-based Mastodon service, has published its Illegal Harms Assessment.
2. Prohibit Illegal Content in Your Rules
Your rules, community guidance, or terms of service should explicitly ban illegal material. These are the “priority illegal content” items:
- Terrorism
- Child Sexual Exploitation and Abuse (CSEA)
- Grooming
- Image-based Child Sexual Abuse Material (CSAM)
- CSAM and CSEA URLs
- Encouraging or assisting suicide
- Hate
- Harassment, stalking, threats and abuse
- Controlling or coercive behaviour
- Drugs and psychoactive substances
- Firearms, knives or other weapons
- Human trafficking
- Unlawful immigration
- Sexual exploitation of adults
- Extreme pornography
- Intimate image abuse
- Proceeds of crime
- Fraud and financial offences
- Foreign interference
- Animal cruelty
Additionally, prohibit “other illegal content” (Ofcom’s “non-priority illegal content”), and consider prohibiting “bullying content, eating disorder content, self-harm content or suicide content” as well, even if you do not allow children on your service. These types of content are not necessarily illegal for adults, but they are listed in the OSA as priority content harmful to children. If your service is or may be accessed by children, you should take additional steps to mitigate exposure to these materials.
Guidance and example language on these harms can be found by exploring the Actors, Behaviours, and Content sections of the IFTAS Connect Moderator Library.
Example community rules are also available.
3. Respond to User Reports
You must offer users a way to report illegal content. Most Fediverse platforms, including Mastodon and other common software, already provide a “report content” and/or “report account” function, which meets this need; if yours does not, a web form or a support email address will do. What matters is that you respond and take appropriate action.
For small, low-risk services, there is no requirement to proactively monitor or scan content.
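If you run Mastodon, one lightweight way to make sure reports are not slipping through is to periodically check the moderation queue. The sketch below is a minimal, illustrative example rather than a required compliance measure: it assumes a Mastodon server exposing the Admin API at /api/v1/admin/reports, an access token with the admin:read:reports scope, and the Python requests library; the INSTANCE and ACCESS_TOKEN values are placeholders, and other Fediverse software has its own tooling.

```python
# Minimal sketch (not a compliance requirement): list unresolved moderation
# reports on a Mastodon instance so an admin can review and act on them.
# Assumptions: Mastodon's Admin API at /api/v1/admin/reports, a token with the
# admin:read:reports scope, and the third-party "requests" library installed.
import requests

INSTANCE = "https://yourdomain.social"        # placeholder instance URL
ACCESS_TOKEN = "replace-with-an-admin-token"  # keep this secret


def open_reports():
    """Fetch reports that have not yet been resolved."""
    resp = requests.get(
        f"{INSTANCE}/api/v1/admin/reports",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"resolved": "false"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    for report in open_reports():
        # Field names may differ slightly between Mastodon versions.
        print(report.get("id"), report.get("created_at"), report.get("comment", ""))
```

Running something like this on a schedule and reviewing the output is one low-effort way to demonstrate that you have a working process for handling reports.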
4. Nominate a Contact for Ofcom
Your service or website should name an individual, or a role such as compliance lead, who can respond to regulatory enquiries. Make sure Ofcom knows who to email if they have questions. This could be a general email like admin@yourdomain.social, as long as someone reliably monitors it. Listing this in your Terms of Service and on your website will help regulators find the right point of contact.
If you are contacted by Ofcom, contact IFTAS; we will be happy to help small and medium Fediverse providers.
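As a quick sanity check that your contact details are actually discoverable, you can look at what your server already publishes. The snippet below is a minimal sketch assuming a Mastodon instance, which advertises its contact email via the /api/v2/instance endpoint; the endpoint and field layout are Mastodon-specific and can vary by version, other Fediverse software will differ, and the INSTANCE value is a placeholder.

```python
# Minimal sketch: print the contact email a Mastodon instance advertises
# publicly, via the /api/v2/instance endpoint (Mastodon-specific; other
# Fediverse software exposes this differently, if at all).
from typing import Optional

import requests

INSTANCE = "https://yourdomain.social"  # placeholder instance URL


def advertised_contact(instance_url: str) -> Optional[str]:
    """Return the contact email the instance publishes, if any."""
    resp = requests.get(f"{instance_url}/api/v2/instance", timeout=30)
    resp.raise_for_status()
    data = resp.json()
    # Mastodon nests the email under "contact"; layout may vary by version.
    return (data.get("contact") or {}).get("email")


if __name__ == "__main__":
    email = advertised_contact(INSTANCE)
    if email:
        print(f"Publicly advertised contact email: {email}")
    else:
        print("No contact email advertised; set one in your admin settings.")
```

If nothing is returned, set a contact email in your server’s admin settings and mirror it in your Terms of Service and on your website, as described above.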
5. Publish Clear Terms of Service
Your terms of service should be understandable, accessible, and reflect your moderation approach and safety measures. There are several reliable resources to help you build or adapt yours:
- OnlineSafetyAct.co.uk ToS Template: developed by Neil Brown, a UK-based lawyer who has contributed many helpful resources to the sector
- toot.wales Terms of Service: a practical example adapted from Neil Brown’s copyleft terms
- Sample Fediverse Terms for Registered Users: detailed example maintained by Neil Brown, suitable for decentralised platforms
- Terms of Service for Everyone: guidance from the Law Office of August Bournique that predates the OSA but remains helpful for Fediverse providers
IFTAS can support operators (in a non-legal advisory capacity) in reviewing and tailoring these documents to suit the needs of their communities.
6. Consider Whether Children Are Likely to Use Your Service
For the purposes of the OSA, “child” means a person under the age of 18. If your service does not target children and does not host a significant number of children, your duties regarding child safety are more limited.
The OSA guidance is clear:
“Services that do not have highly effective age assurance in place must assess whether children are likely to be on the service…”
Ofcom’s Child Access Assessment is a short, two-stage test: Stage 1 asks whether it is possible for children to access your service (which turns on whether you use highly effective age assurance), and Stage 2 asks whether children use, or are likely to use, your service in significant numbers.
Most general-purpose Fediverse platforms can truthfully state:
- They do not have a significant number of child users.
- Their service is not of a kind likely to attract a significant number of children.
If you can truthfully answer no to both Stage 2 questions, your child access assessment is done. Here’s the toot.wales example.
This reasoning should be documented in your risk assessment. There is no official threshold for what counts as “significant”: figures such as 700,000 UK children and 7 million UK children are variously quoted, but proportion matters too; if 99 of your 100 users are children, that is also significant. Regulations are never as simple as you might like them to be.
If you are a small, low-risk Fediverse provider, you most likely do not have a significant number of children using, or wanting to use, your service. If you do…
What If Your Service Does Serve or Attract Children?
If you operate a service that is designed for or significantly used by children, or if your platform offers features likely to attract children, your responsibilities under the OSA are more complex. In these cases, you are likely to require highly effective age assurance measures, as well as specific protections against harmful but legal content.
Some considerations for these platforms include:
- Age Assurance: You must implement robust methods to estimate or verify users’ ages. This may include self-declaration combined with additional signals, third-party tools, or parental consent mechanisms.
- Content Filtering: Platforms should consider technical and policy-based approaches to limit children’s exposure to high-risk content, even if that content is legal.
- Child-Friendly Design: Interfaces, terms, and moderation systems should be understandable to children. Ofcom is expected to publish further guidance under its Children’s Safety Code of Practice.
- Parental and Guardian Support: Provide mechanisms for parents or guardians to report concerns or manage a child’s access to your service.
Platforms in this category should seek specialist guidance, from a legal or child safety expert, as the requirements are stricter and the regulatory expectations higher. To put it simply, if you want to run an internet service for children in the UK, you are going to need a lawyer.
In Summary
Compliance with the Online Safety Act does not mean giving up autonomy or radically changing how you operate. It means documenting your risks, being clear about your rules, and responding responsibly to reports of illegal behaviour. This is something most responsible Fediverse administrators are already doing.
By taking a few proportionate steps, you can quickly show both legal compliance and a commitment to community care. You protect yourself and your users, and you strengthen your community’s resilience.
Here’s what that looks like in practice:
- Conduct an illegal harms risk assessment, adapting one of the published examples if needed.
- Prohibit illegal content.
- Respond to reports, and remove illegal content when it is reported to you.
- Publish a contact for Ofcom (and other regulators) to find you if needed.
- Publish clear terms of service, adapting one of the available templates if needed.
- Assess how likely it is that you host or will host a significant number of children.
If you are a small, low-risk service, this is probably four to eight hours of effort. It is not a get-out-of-jail-free card, but it is a good-faith first step that you can use to demonstrate to Ofcom (or others) how you approach safety on your service.
IFTAS is here to support you. If you are acting in good faith and prepared to demonstrate that with basic documentation, then you already have the most important protections in place. If Ofcom ever asks, you will have a clear and reasonable explanation, grounded in public guidance and community safety principles. IFTAS maintains a closed community group for active volunteer moderators; if you’re not a member, request an account today.
Resources
- ofcom.org.uk/siteassets/resour…
- onlinesafetyact.co.uk/
- russ.garrett.co.uk/2024/12/17/…
- buttondown.com/indie-and-commu…
- connect.iftas.org/library/lega…
- connect.iftas.org/groups/legal… (members only)
Please Note
- Jaz-Michael King, Director of IFTAS, is also the administrator of the toot.wales service, and it is for this reason their resources are listed above. If you have made your Assessments, Terms of Service, Community Guidance/Rules public, specifically to address OSA, please let us know so we can add them to this page and the Moderator Library. Any and all shared policies and documents should be considered advisory, and represent a best-faith effort to share how others are approaching this issue. The inclusion of toot.wales is not an endorsement of one service over another. All shared materials are presented to support community learning and replication.
- This page (and IFTAS Connect’s resources page) will be updated as and when we get feedback on the above. If you think we’ve made an error or feel we need to clarify something, please contact Jaz directly: mastodon.iftas.org/@jaz
Petition: Repeal the Online Safety Act
“We want the Government to repeal the Online Safety Act.” (Petitions, UK Government and Parliament)