Supporting the DSA Implementation: How the EC is taking the lead on ensuring the effectiveness of the new rules and the protection of users online
The Digital Services Act (DSA) came into full force on February 17, 2024, although the largest tech platforms have been within its scope since August 2023. The European Commission (EC) has been engaging proactively with platforms, seeking greater visibility into their practices and bringing more clarity to the public, a constructive approach that we warmly welcome.
At the end of last year, formal proceedings were initiated against X (formerly Twitter), assessing potential breaches related to risk management, content moderation, dark patterns, advertising transparency, and data accessibility for researchers. Earlier this week, TikTok came under formal investigation in areas linked to the protection of minors, advertising transparency, data access for researchers, and the risk management of addictive design and harmful content.
The DSA regulates online platforms and digital services. Its primary objectives are to establish clearer rules for digital intermediaries, enhance user safety and protection, foster innovation, and ensure a fair and competitive digital market. It addresses challenges arising from the rapid evolution of online services, including disinformation, hate speech, and the dissemination of illegal content. The DSA seeks to protect users, both consumers and businesses, by imposing obligations on platforms to mitigate risks associated with their services while preserving fundamental rights such as freedom of expression. Services in scope of the DSA span a wide range of online intermediaries, including social media platforms, online marketplaces, and search engines, with the aim of creating a safer and more transparent digital environment.
In the past few weeks we have seen extensive coverage of how the DSA will be implemented in practice, and we are very supportive of the EC's actions to engage platforms and enforce these rules, particularly in the context of safeguarding children online. Working towards practical, enforceable solutions will be critical: effective implementation is the key to the success of these regulations and to the protection of users.
We strongly believe that effective, privacy-preserving tools to safeguard young users online already exist, including age estimation solutions for creating age-appropriate experiences. We are committed to working closely with regulators, platforms, industry stakeholders, and experts to foster a safer online environment.