
Tech companies need to update algorithms to safeguard children or risk substantial penalties, warns Ofcom.

Recent guidance from the communications watchdog has raised concerns about the effectiveness of current moderation efforts in protecting underage users.

Ofcom has issued a warning to online platforms, including popular social media sites, stating that they will face significant fines and enforcement actions if they do not make changes to their algorithms to prevent harmful content from being recommended to children.

Draft guidance released by the UK communications watchdog outlines 40 safety measures, such as strict age checks, to safeguard underage users.

This means that popular social media platforms like Facebook, Instagram, and Snapchat will need to implement effective age verification processes. In some cases, this may involve restricting children’s access to the platform entirely.

Technology Secretary Michelle Donelan emphasized the need for platforms to implement real-world age verification processes and address algorithms that expose young users to harmful content online.

Ofcom suggests various age verification methods, including checking bank records with user consent, photo ID matching, facial age estimation, and credit card checks.


Moreover, the regulator has instructed content providers to filter out harmful content from children’s feeds by adjusting the algorithms that drive personalized recommendations.

Ofcom’s Chief Executive Dame Melanie Dawes stated that the guidance goes beyond current industry standards and warned that the watchdog will not hesitate to use enforcement measures against platforms that do not comply.

Under the Online Safety Act, the draft Children’s Safety Codes of Practice were introduced to ensure the safety of underage users on online platforms. Once approved by Parliament, the Codes will be enforced, holding tech companies accountable for the safety of their platforms.

The Chief Executive of the NSPCC, Sir Peter Wanless, emphasized that tech companies have a legal obligation to protect underage users on their platforms.

Ofcom noted the prevalence of harmful content affecting children online, citing research that shows 62% of children aged 13-17 have encountered harmful content in a four-week period.

Bereaved Families

Parents like Ian Russell, whose daughter Molly took her own life after being exposed to disturbing content online, are advocating for stronger measures to protect children on social media platforms.

“The regulator has proposed some important and welcome measures, but its overall set of proposals need to be more ambitious to prevent children encountering harmful content that cost Molly’s life,” Mr. Russell stated.

Ofcom plans to finalize the Children’s Safety Codes of Practice within a year, urging platforms to conduct children’s risk assessments and adhere to the guidance provided in the draft.
