Ireland's Regulator Launches Investigations Into TikTok and LinkedIn Over Suspected DSA Violations
Ireland's media regulator, Coimisiún na Meán, has opened investigations into two prominent social media platforms, TikTok and LinkedIn, over concerns they may be violating the European Union's Digital Services Act (DSA). The investigations centre on how the platforms present and implement their tools for reporting illegal content, with the regulator pointing to possible "deceptive interface designs" that could make those tools less effective at getting illicit material removed.
According to the regulator, the concern lies in how these reporting mechanisms are designed and presented. Specifically, users may believe they are reporting content as illegal when in fact they are only flagging it as a breach of the platform's terms and conditions. That lack of clarity raises questions about how effectively the platforms handle genuinely unlawful content.
The DSA emphasises user rights and requires providers to offer easily accessible, user-friendly mechanisms for reporting content that may be illegal or in breach of their terms and conditions. The regulator's concern is that poor design choices may mean these mechanisms fall short of that standard.
"The core principle of the DSA is the right of people to report content that they suspect to be illegal," said John Evans, Coimisiún na Meán's DSA Commissioner. "Providers are obliged to make their reporting mechanisms easy to access and user-friendly." It is a particular concern, he added, if providers' interfaces manipulate or mislead users in any way.
The investigations form part of a broader push by Ireland's regulators to ensure platforms comply with the DSA. Coimisiún na Meán has already compelled other providers to make "significant changes" to their illegal-content reporting mechanisms following similar probes. Platforms found in breach of the DSA can face fines of up to six percent of their global annual revenue.
Meanwhile, a separate investigation is underway into X over concerns that training its Grok AI assistant on user posts may violate GDPR rules, a breach that could expose X to a fine of up to four percent of its global revenue.