UK Companies Legally Obligated to Prevent Children from Accessing ‘Harmful Content’


Communications regulator Ofcom announced that it will seek court orders to block websites in the UK that do not adhere to the Online Safety Act.

Under the new internet regulations, companies are now obligated to restrict children’s access to “harmful content” in the UK.

The Online Safety Act (OSA), dubbed the world’s first online safety law, was enacted in October 2023. However, its child safety duties for sites and apps come into effect this July.

Ofcom, the UK communications regulator enforcing the act, outlined more than 40 practical steps for tech companies on April 24 to prevent minors from encountering harmful content related to suicide, self-harm, eating disorders, and pornography.

Providers of services accessible to UK children have until July 24 to conduct and document an assessment of the risks their service poses to children.

The required measures include configuring social media feeds to surface less harmful content and adding safeguards against contact from strangers.

If companies do not meet their new obligations, Ofcom has the authority to levy fines and, in serious cases, seek a court order to block the site or app in the UK.

Dame Melanie Dawes, Ofcom’s chief executive, emphasized that the changes represent a significant step for child protection online.

Providers of the riskiest services, such as pornography sites, must implement strong age verification methods to identify which users are children, while still enabling adults to access legal content.

Previously, Ofcom outlined a list of methods deemed highly effective for age verification.

These methods include open banking, photo ID matching, facial age estimation, and more.

Companies that fail to implement age verification processes by July 25 face enforcement action by Ofcom, including fines of up to £18 million or 10% of a company’s revenue, whichever is greater.

Ofcom may also seek court orders to disrupt third parties’ services that support non-compliant websites.

Forums

Social media platforms and user-to-user service providers must actively monitor their platforms for harmful content under the act.

Sites with user interaction, including forums, must complete an illegal harm risk assessment.

However, the regulatory pressure also falls on numerous smaller UK websites, ranging from cycling forums to forums for divorced fathers, and some have closed as a result.

These sites must keep detailed risk assessment records and evaluate potential harm levels.

While some offenses like terrorism are clear-cut, others such as hate offenses are more challenging to manage on large forums.

Popular cycling forum London Fixed Gear and Single Speed closed in December 2024 due to regulatory burdens.

U.S. sites are also blocking UK users due to the legislation.

The Open Rights Group urged the government to exempt small community websites from the act.

Critics have raised concerns that the extensive regulations outlined by Ofcom may favor large companies, which can more easily absorb compliance costs, over small ones.
