
Understanding the Digital Services Act (DSA)

  • Writer: Ayesha Ansar
  • Feb 26
  • 4 min read

The Digital Services Act (DSA): The EU's Transformative Law on Online Accountability - Co-written by Zoya Baig


The Digital Services Act (DSA) is a landmark regulation adopted by the European Union to modernise the rules governing digital services, platforms, and intermediaries in the EU's Digital Single Market. Formally adopted in 2022 and entering into effect over 2023-2024, it supersedes aspects of the earlier e-Commerce Directive and creates EU-wide harmonised rules on online intermediary responsibilities, transparency, and user safety. It governs a wide variety of online intermediaries available to users in the Union, including social media networks, marketplaces, search engines, and app stores.

In essence, the DSA aims to create a safer, more democratic, and more open online space, one that respects citizens' fundamental rights and gives them greater control over online communication. It was created in response to the accelerating digitalisation of society and growing concern about illegal content, misinformation, opaque algorithmic processes, and the unchecked power of platforms. The regulation embeds core EU principles into digital governance, including freedom of expression, consumer protection, the safety of children, and the integrity of democratic processes.


Major Characteristics of the DSA

One of the DSA's most important innovations is its differentiated regulatory architecture, which scales obligations with platform size and systemic impact. The strictest requirements apply to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), i.e. those with 45 million or more monthly active users in the EU (roughly 10% of the EU population). These services must undergo annual systemic risk assessments, submit to independent audits, and mitigate the societal harms they identify.
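The size-based tiering described above can be sketched in a few lines of code. The 45-million-user threshold comes from the regulation itself; the class and function names below are illustrative assumptions, not anything defined by the DSA:

```python
# Illustrative sketch of the DSA's size-based designation rule.
# The 45M threshold (~10% of the EU population) is from the regulation;
# the Platform class and designation() function are hypothetical.
from dataclasses import dataclass

VLOP_THRESHOLD = 45_000_000  # monthly active EU users triggering designation


@dataclass
class Platform:
    name: str
    monthly_active_eu_users: int
    is_search_engine: bool = False


def designation(p: Platform) -> str:
    """Return the DSA tier a service would fall into under the 45M-user rule."""
    if p.monthly_active_eu_users >= VLOP_THRESHOLD:
        return "VLOSE" if p.is_search_engine else "VLOP"
    return "standard intermediary"


print(designation(Platform("ExampleSocial", 52_000_000)))        # VLOP
print(designation(Platform("ExampleSearch", 60_000_000, True)))  # VLOSE
print(designation(Platform("SmallForum", 2_000_000)))            # standard intermediary
```

In practice, designation is a formal decision by the European Commission based on user numbers that platforms themselves must publish; the sketch only captures the numeric trigger.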


Accountability and Transparency

At the core of the DSA is an enhanced transparency regime: platforms are now required to publish their content moderation policies, maintain advertisement repositories, and disclose information about their risk-assessment processes and algorithms. They must also publicly report aggregated data on content takedowns, automated detection accuracy, and appeal outcomes. The EU has additionally introduced tools such as the DSA Transparency Database, which collects anonymised, self-reported moderation actions, a first of its kind worldwide, allowing civil society, researchers, and regulators to scrutinise platform decisions.


Transparency requirements also include annual audits and risk assessments, in which platforms must identify and mitigate systemic risks such as the spread of hate speech, disinformation, and harms to minors; however, how these assessments are conducted and enforced remains challenging and not yet fully developed.


User Rights and Controls

The DSA gives users redress and appeal procedures against content moderation actions taken on their accounts or posts. This includes platforms' internal complaint-handling procedures and out-of-court dispute settlement, which offers resolution that is faster and cheaper than litigation.


The regulation also prohibits targeted advertising to children and outlaws ad profiling based on sensitive personal data, strengthening protections for vulnerable users. Platforms must also offer accessible, transparent mechanisms for reporting illegal content and notify users of the outcome.


Whistleblower Protection and Cooperation

The DSA creates whistleblower channels that enable insiders, such as employees, contractors, or third parties, to safely report harmful practices or compliance breaches at large platforms. Backed by strong data protection and anonymity guarantees, this strengthens enforcement by channelling information to regulators that would otherwise be out of reach.


In addition, the regulation establishes a multi-level cooperation structure between the European Commission, national Digital Services Coordinators, and the DSA Board to coordinate enforcement and intelligence sharing among the Member States.


Practical Implementation

Two years after implementation, the DSA has had concrete effects on EU digital governance. For example, users have challenged almost 50 million content moderation decisions by invoking their DSA rights to appeal account suspensions or removed content, a significant shift in user agency and accountability. In a significant share of appeals, platforms reversed their decision and reinstated content or accounts, and out-of-court dispute settlement bodies have overturned platform decisions in more than 50 percent of appeals.


Enforcement actions are now following real-world behaviour. The European Commission's recent investigation into platforms such as Shein, over potential DSA violations involving both illegal product listings and addictive design, shows the DSA being applied to behavioural design, an area that was previously a regulatory blind spot, in addition to illegal content and transparency requirements.


At the same time, preliminary findings of regulatory investigations suggest that applications such as TikTok may have breached fundamental DSA principles on user safety and addictive design, and may have to redesign their services or pay considerable penalties tied to their international user base.


Limitations

Despite its pioneering scope, the DSA has been criticised on several fronts. To begin with, there are enforcement and reporting loopholes: platforms largely enforce their own terms and conditions rather than EU or national law, which limits legal clarity.


Transparency data quality remains uneven, and major difficulties persist in aligning systemic risk assessments and audit results across platforms. Civil society research has highlighted discrepancies in platforms' reporting formats and an absence of comparable data, obstructing the construction of robust external oversight.


Critics also argue that the DSA's focus on user controls and appeal systems, though necessary, pays too little attention to the underlying algorithmic harms inherent in recommendation systems and behavioural design, and that future revisions should be more specific on these technical aspects.


Policy Recommendations

Policymakers should consider:

  • Standardisation of transparency reporting formats across platforms to foster data comparability and minimise ambiguity in interpretation.

  • Strengthening audit procedures through clear sampling and content-analysis policies to improve the rigour of systemic risk analyses.

  • Expanding collaboration with independent researchers and civil society by providing timely access to non-aggregated platform metadata within secure research data environments.


To sum up, the Digital Services Act is an ambitious endeavour to rebalance power in the digital realm, with a focus on user rights, accountability, and transparency. Though implementation challenges remain, its initial effects signal a significant shift in global platform governance, creating legal paradigms likely to shape regulatory discourse in Europe and well beyond.
