Online Safety Act 2023

The Online Safety Act 2023 is a UK law aimed at protecting both children and adults online. It applies to messaging apps, social media platforms, and search engines, imposing new duties on the companies that run them to keep their users safe. In practice, these companies must put policies and procedures in place to minimize the risk of their services being used for illegal activity.

Under the Act, platforms must be clear about what harmful content they allow and give users more control over what they see. Companies are required to remove illegal content once it is identified. Additionally, service providers must prevent children from seeing age-inappropriate content and give parents clear, accessible ways to report safety issues.

Frequently Asked Questions

1. Who Does the Act Apply to?

The Online Safety Act 2023 applies to online services where users can post content or interact with one another. Examples include:

  • Social media
  • Cloud storage
  • Video sharing
  • Forums
  • Image/message boards
  • Dating apps
  • Online gaming platforms
  • Adult content sites
  • Messaging services
  • Search services (any online business with search functionality)

The Act covers services with links to the UK, including providers based outside the UK whose services are used by UK residents or targeted at the UK market.

2. What is a "User-to-User Service" According to the Online Safety Act 2023?

3. What Measures Should Platforms Integrate to Comply with the Online Safety Act?

4. Who is the Regulator of the Online Safety Act?

5. What are the New Offenses Introduced by the Online Safety Act?

