The European Commission has taken fresh action against TikTok, the short-form video platform owned by ByteDance, saying its design encourages compulsive use and breaches the European Union’s Digital Services Act (DSA), one of the bloc’s strongest tech regulations. Regulators are now demanding changes that could fundamentally alter how users consume short-form video content.
At issue is TikTok’s “infinite scroll” feature, the continuous feed that loads new videos without user interaction, along with autoplay and highly personalised recommendations. According to the Commission’s preliminary findings in early February 2026, these design elements push users into a “reward loop” that encourages prolonged viewing, a pattern many experts link to compulsive screen time, especially among children and teenagers.
Current screen-time tools on TikTok — such as one-hour prompts for younger users and optional time limits — were judged insufficient by regulators. Officials said these protections are easy to dismiss and do not prevent long continuous viewing, meaning the platform still falls short of the DSA’s requirements to manage risks to physical and mental wellbeing.
Under the Digital Services Act, very large online platforms like TikTok must assess and mitigate systemic risks associated with their services, including harmful user behaviour and addictive design patterns. The Commission’s analysis found TikTok’s own risk assessment and mitigation measures are not effective against the so-called “compulsive use” the infinite scroll is thought to encourage.
Now the Commission has told ByteDance it must redesign TikTok’s core user experience, potentially including disabling infinite scroll after a certain period, introducing effective screen-time breaks, and adjusting recommendation systems so that they do not maximise compulsive engagement. If TikTok fails to comply, the EU could issue a non-compliance decision and impose fines of up to 6% of ByteDance’s global annual turnover, a penalty designed to ensure serious enforcement of the DSA.
Why it matters
This move is one of the first major real-world tests of the Digital Services Act, which entered into force to give the EU powerful tools to regulate big online platforms. The initiative marks a shift in regulatory focus from purely data privacy and content moderation to user experience and behavioural design. Lawmakers say protecting users from addictive features is a key objective of the DSA and essential for safeguarding public health, especially among children and vulnerable populations.
Trend impact
If TikTok implements the required changes, it could set a global precedent for how social media platforms balance engagement-boosting features with user wellbeing obligations. Other big platforms — from video services to social networks — may also face increased scrutiny of their design choices. For tech companies operating in the EU, the ruling underscores that user safety and risk mitigation are no longer optional but are enforceable legal obligations under European law. Regulatory pressure like this is likely to influence how digital entertainment products evolve worldwide in the coming years.