EU Moves to Regulate Algorithms on YouTube, TikTok, and Snapchat Over Harmful-Content Concerns
The European Commission has requested that platforms disclose how their algorithms recommend content as part of an inquiry.
The European Commission has sent formal requests for information to YouTube, Snapchat, and TikTok, asking how their closely guarded algorithms recommend content to users, as part of an inquiry into the role these recommender systems may play in amplifying harmful content.
Specifically, the commission is seeking information from YouTube and Snapchat on the functioning of their recommender systems and the criteria guiding content selection. Both platforms have been asked to furnish detailed insights into how their algorithms contribute to heightened risks related to civic discourse, electoral integrity, protection of minors, and mental health, especially concerning addictive behaviors and content “rabbit holes.”
The commission expressed concerns about how TikTok’s algorithms could be leveraged to sway public opinion or spread disinformation, particularly during elections. TikTok has been questioned about its measures to combat manipulation by malicious entities and mitigate risks regarding elections, media diversity, and civic discourse.
Under the Digital Services Act (DSA), platforms with more than 45 million monthly active users in the European Union must implement stringent user-protection measures. These large online platforms must assess the risks their systems pose to users, particularly concerning harmful content and user safety, and take steps to mitigate those risks. Failure to comply can result in fines of up to 6% of a company's global annual turnover.