Italy fines TikTok $11 million for failing to protect minors

Italy’s competition authority has imposed a €10 million ($11 million) fine on TikTok for not effectively managing the dissemination of content that poses a risk to the safety of minors and other vulnerable individuals.

The antitrust agency, AGCM, said on Thursday that TikTok, which is owned by China’s ByteDance, failed to account for the particular vulnerability of teenagers, including their tendency to imitate group behavior.

The watchdog criticized the video-sharing platform for not doing enough to stop harmful content from spreading, saying its recommendation algorithms allow such content to keep resurfacing and being served to users.

One example of harmful content cited by the AGCM is the “French Scar” trend on TikTok, in which users pinch their cheeks hard enough to leave a lasting bruise, as Reuters reported last month.

A TikTok spokesperson told CNN that the company disagreed with the AGCM’s decision. “We long ago restricted visibility of this content to (under-18s).”

This is a developing story and will be updated.

Angelica Chiara Yazbeck contributed reporting.

Editor's P/S:

The Italian competition authority's fine against TikTok for failing to protect minors from harmful content is a crucial step towards ensuring the safety of young people online. The platform's algorithms, which can perpetuate the spread of dangerous content, pose significant risks to vulnerable users. The fine serves as a warning to social media companies that they must prioritize the well-being of their underage users by implementing robust safeguards against harmful content.

TikTok's defense that it long ago restricted under-18s from viewing such content is insufficient. The platform must take proactive measures to prevent harmful trends from gaining traction in the first place. It is essential that social media companies invest in content moderation, age verification, and educational resources to protect young users from the potential dangers lurking online.