
TikTok’s Addictive Design and the Digital Services Act

  • Writer: Ayesha Ansar
  • Apr 13
  • 3 min read

The Digital Services Act (DSA) is the European Union’s flagship legislation for making online platforms safer and more accountable. Its core idea is simple: platforms must identify, assess, and mitigate the risks their services pose, particularly to children and vulnerable users.



On 6 February 2026, the European Commission preliminarily found that TikTok had breached the Digital Services Act through design features that foster addictive use. For digital safety and platform governance, the finding confirms long-standing concerns about how engagement-driven platforms are regulated in the EU.


The case is still developing, but the preliminary findings already offer valuable insight into how the DSA is being enforced.


What the Commission Found

The Commission’s investigation found that TikTok failed to properly assess the risks associated with features such as infinite scroll, autoplay, push notifications, and its highly personalised recommender system. These features continuously reward users with new content, encouraging longer sessions and eroding self-control.


Scientific research indicates that these design patterns can produce compulsive behaviour and harm mental health. Under the DSA’s systemic-risk provisions, platforms must assess such risks. According to the Commission, TikTok did not.


Disregard of Clear Warning Signs

A central concern for regulators is that TikTok appears to have ignored strong signals of harmful use: how frequently users open the app, how many minutes minors spend on the platform at night, and patterns of excessive daily usage.


TikTok has about 170 million users in the EU, a significant share of them children. It is reportedly the most-used platform after midnight among users aged 13 to 18, and roughly 7 percent of children aged 12 to 15 spend four to five hours a day on the app. For DSA compliance, figures like these should have triggered deeper risk assessment and stronger mitigation.


Why Current Protection Measures Are Not Enough

TikTok has pointed to its screen-time tools and parental controls as safety measures. The Commission’s preliminary view, however, is that these measures are ineffective: screen-time prompts are easy to dismiss, and parental controls demand time, expertise, and sustained attention from parents.


Under the DSA, risk mitigation measures must be reasonable, proportionate, and effective. Shifting responsibility onto users or parents does not meet that standard.


Policy Takeaways

• Under the DSA, features like infinite scroll and autoplay must be treated as safety issues, not neutral design choices.

• Ignoring data on excessive use by minors, especially at night, weakens DSA compliance.

• Easy-to-dismiss screen-time tools and weak parental controls do not meet DSA standards.

• Safety measures must be structural and default, not optional.

• Real penalties and design changes are essential to protect users online.


Future Action and Relevance of This Case

TikTok now has the right to respond to the Commission’s findings, and the European Board for Digital Services will be consulted. If the preliminary findings are confirmed, TikTok could be fined up to 6 percent of its global annual turnover and be required to change the design of its service.


This case reaches beyond TikTok. It signals that, under the Digital Services Act, platforms can be held accountable not only for the content they host, but also for the systems they build and the behaviours those systems encourage.

