
Investigating addictive algorithms, harmful content, privacy concerns, and beyond.

TikTok Faces EU Investigation for Potential Digital Services Act Violations

TikTok is under scrutiny by the European Union (EU) for potential breaches of the Digital Services Act (DSA), chiefly around the safety of minors. The investigation will focus on addictive algorithms, the “rabbit hole effect,” age verification issues, and default privacy settings. In a press release, the European Commission said it will also examine advertising transparency and data access for researchers.

The probe centers primarily on the privacy and safety of minors. The European Commission aims to assess whether TikTok’s design and recommendation algorithms encourage addictive behavior or steer users toward harmful content through the “rabbit hole effect.” The objective, according to the EC, is to mitigate these risks, protect users’ physical and mental well-being, and uphold the rights of children.

As part of the investigation, TikTok’s age verification tools will be evaluated to ensure they effectively prevent minors from accessing inappropriate content. Additionally, the social media platform will be required to implement high levels of privacy, safety, and security for minors through default privacy settings, similar to the measures taken with Meta’s Instagram and Facebook.

The European Union is also examining TikTok’s compliance with DSA obligations to provide a searchable and reliable repository for advertisements. Furthermore, the EU is investigating potential shortcomings in researcher access to TikTok’s publicly accessible data, as mandated by the DSA.

Following the opening of the proceedings, the European Commission will continue to gather evidence. The procedure allows for further enforcement actions, including interim measures and non-compliance decisions.

TikTok, along with its parent company ByteDance, has already made significant changes for EU users to comply with the DSA. These include letting users opt out of algorithm-powered personalized content on their For You Page (FYP), adding new options for reporting harmful content, and discontinuing personalized ads for EU users between the ages of 13 and 17.

In addition to this investigation, the EU is separately looking into TikTok’s and Meta’s efforts to address illegal content and misinformation related to the ongoing Middle East conflict. Meta was previously fined $414 million over its personalized ads practices, and there are reports that both Meta and TikTok may introduce paid tiers allowing users to opt out of personalized ads. Civil rights groups have urged the EU to reject these plans, labeling them “pay for privacy.”

External Links:
– European Commission Press Release: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_926
