• The European Commission has launched formal legal proceedings to investigate TikTok’s compliance with the EU Digital Services Act regarding various aspects including minor protection, advertising transparency, and harmful content risk management.
  • The decision was based on a preliminary analysis of TikTok’s risk assessment report and its responses to information requests, with the European Commissioner for Internal Market emphasising the need for a thorough investigation.
  • If TikTok is found to have violated DSA rules, its owner, ByteDance, could face fines of up to 6% of its global turnover, with the investigation focusing on areas such as systemic risk evaluation, privacy protection for minors, and transparency measures on the platform.

On February 19th, the official website of the European Commission announced that it has initiated formal legal proceedings to investigate TikTok’s compliance with the EU Digital Services Act (DSA) regarding minor protection, advertising transparency, research data access, addictive design, and harmful content risk management.

The European Commission stated that the decision to launch the formal procedure was based on a preliminary analysis of TikTok’s risk assessment report sent in September 2023 and its responses to information requests.

Thierry Breton, the European Commissioner for Internal Market, indicated that he made this decision after analysing the risk assessment report of the short video application and its responses to information requests.


There will be a thorough investigation ahead

The European Commission emphasised that it will prioritise a thorough investigation into TikTok and that the initiation of the formal procedure does not prejudge its outcome. The DSA sets no statutory deadline for concluding formal procedures; the duration of the in-depth investigation depends on several factors, including the complexity of the case, the degree of cooperation between the company and the European Commission, and the exercise of the right of defence.

A TikTok spokesperson responded publicly, stating that the platform has introduced features and settings to protect teenagers and does not allow children under 13 to use it. In a company statement obtained by the financial news outlet First Financial, TikTok said that “this is a problem that the entire industry is working hard to solve”, that it will continue to cooperate with experts and the industry to ensure the safety of teenagers on TikTok, and that it looks forward to the opportunity to explain this work in detail to the Commission.

ByteDance may face fines of up to 6% of its global turnover

If TikTok is found to have violated DSA rules, its owner, ByteDance, could face fines of up to 6% of its global turnover. In response to the formal investigation launched by the European Union, ByteDance declined to comment to First Financial reporters, referring them instead to TikTok’s statement.

According to the European Commission’s website, the proceedings will focus on areas such as TikTok’s compliance with DSA obligations related to assessing and mitigating systemic risks; whether appropriate measures have been taken to ensure a high level of privacy, safety, and protection for minors, particularly with regard to default privacy settings for minors as part of the design and operation of its recommender systems; whether a searchable and reliable repository of advertisements displayed on TikTok has been provided; and whether measures have been taken to increase the platform’s transparency.

Specifically, the European Union will assess whether actual or foreseeable negative effects stemming from the design of TikTok’s systems, including its algorithmic systems, may stimulate addictive behaviour and produce the so-called “rabbit hole effect”, endangering users’ fundamental rights and children’s physical and mental health. It will also assess whether TikTok has put in place appropriate mitigating measures, notably age verification tools intended to prevent minors from accessing inappropriate content. The investigation will further examine suspected shortcomings in TikTok’s compliance with Article 40 of the DSA, which concerns researchers’ access to publicly accessible platform data.

If these suspicions are confirmed, TikTok may be found in breach of Articles 34(1), 34(2), 35(1), 28(1), 39(1), and 40(1) and (2) of the DSA. The European Commission also stated that, following the formal opening of the procedure, it will continue to gather evidence, for example by sending additional requests for information, conducting interviews, or carrying out inspections.

In addition, the opening of the formal procedure empowers the European Commission to take further enforcement steps and to accept any commitments made by TikTok to remedy the issues raised in the proceedings.