Ireland's Regulator Launches Investigations Into TikTok and LinkedIn Over Suspected DSA Violations

Ireland's media regulator, Coimisiún na Meán, has opened investigations into two prominent social media platforms, TikTok and LinkedIn, over concerns they may be violating the European Union's Digital Services Act (DSA). The investigation centers on how the platforms present and implement their reporting tools for illegal content, with the regulator pointing to suspected "deceptive interface designs" that could make those tools less effective at getting illicit material removed.

According to the regulator, the issue lies in how these reporting mechanisms are designed and presented. Specifically, users may be led to believe they are reporting content as illegal when, in fact, their report is only treated as a complaint that the content breaches the provider's terms and conditions. This lack of clarity raises concerns about how effectively the platforms can handle genuinely illegal content.

The DSA emphasizes user rights and requires providers to offer easily accessible, user-friendly mechanisms for reporting content that may be illegal or in violation of their terms and conditions. The regulator suspects, however, that these mechanisms may fall short of the required standards because of poor design choices.
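The distinction the regulator is drawing can be made concrete with a small sketch. This is a hypothetical model, not any platform's actual API: the category names and routing rules are illustrative assumptions. The point is that a report filed as "illegal content" and a report filed as a "terms violation" enter different processes, so an interface that silently files the former as the latter puts the report in the wrong queue.

```python
from dataclasses import dataclass

# Illustrative categories: a DSA-style "illegal content" notice triggers
# statutory notice-and-action duties, while a terms-of-service complaint
# is handled entirely under the platform's own rules.
ILLEGAL = "illegal_content"
TOS_VIOLATION = "terms_violation"

@dataclass
class ContentReport:
    content_id: str
    category: str

def route_report(report: ContentReport) -> str:
    """Return which process a report enters, based on its category."""
    if report.category == ILLEGAL:
        return "dsa_notice_and_action"  # statutory handling, legal review
    return "internal_moderation"        # provider's discretionary process

# A user who believes they reported something as illegal, but whose report
# was actually filed as a terms violation, lands in the wrong queue:
print(route_report(ContentReport("post-123", TOS_VIOLATION)))
# -> internal_moderation
```

Nothing here depends on how either platform really implements its tools; it only shows why a misleading category choice in the interface changes what happens to the report downstream.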

"The core principle of the DSA is the right of people to report content that they suspect to be illegal," said John Evans, Coimisiún na Meán's DSA Commissioner. "Providers are obliged to make their reporting mechanisms easy to access and user-friendly." A significant concern arises when providers manipulate or mislead users in any way.

The investigation is part of a broader effort by Ireland's regulators to ensure platforms comply with the DSA and GDPR. Coimisiún na Meán has already compelled other platform providers to make "significant changes" to their reporting mechanisms for illegal content following similar probes. Platforms found in non-compliance face fines of up to six percent of their revenue.

Meanwhile, a separate investigation is underway into X (the social media company behind the popular Grok AI assistant) over concerns that training the AI on user posts violates GDPR rules, which could expose X to a fine of up to four percent of its global revenue.
 
I remember back in my day when we just wanted to share our funny cat videos without worrying about getting fined 🤣. Now it seems like all these big tech companies have to follow so many rules, and they're always trying to find ways to wiggle out of it 🙄. It's like, come on guys, if you want us to report your bad stuff, make sure the buttons are clear and easy to press, not some confusing mess 😒.

And don't even get me started on AI and how it's all supposed to be regulated 🤖. I mean, I'm no expert, but I just want my Grok AI assistant to understand what I mean without having to teach it the entire history of human language 💻.

But you know what? I'm not entirely surprised by this new investigation 👀. The EU is trying to keep these big tech companies in check, and if they're not careful, they might end up with a fat fine 💸. Better safe than sorry, right? 🤔
 
Wow 😮🤔 - I mean, come on, TikTok and LinkedIn can't even get their reporting tools right? It's not that hard to make them clear and easy to use. This DSA stuff is all about protecting users' rights, so it's pretty interesting that these platforms are having trouble following the rules. And now they're gonna get fined if they don't shape up... six percent of revenue is a lot 💸👀.
 
I think it's crazy that some platforms aren't taking this DSA thing seriously enough 🤯, especially when it comes to reporting illegal content. If it's as simple as a misleading interface design, that's just a red flag for me 👎. We gotta make sure these big players do what's right for users, you feel? 🙏 And on another note, Ireland is killin' it with their regulatory efforts 💪, keep pushing those platforms to step up their game! 😊
 
🤦‍♂️ Like, what's next? Are we gonna probe Snapchat for not having a 'report suspicious squirrel' feature 🐿️? But seriously, it's pretty weird that TikTok & LinkedIn are getting called out for their reporting tools being "deceptive interface designs". Can't they just make them more user-friendly or something? Like, I get it, the DSA wants to protect users, but come on, it's not like these platforms are intentionally trying to deceive people... unless you're a cat who gets too many selfies posted online 😸.
 
I'm like totally worried about these platforms 🤔. If they can't even be bothered to make their reporting tools clear, how are we supposed to trust them with our personal info? It's like, I get it, mistakes happen, but come on! The EU is trying to protect us from all this junk online and these companies are just playing the system 🙄. I mean, have you tried reporting something weird on TikTok or LinkedIn? Good luck with that 😂. They need to step up their game and make sure we can trust them. If not, they'll be stuck with a fat fine 💸.
 
I'm thinking, these social media giants are so used to getting away with stuff 🤑... think they can just do whatever they want and no one will call them out. But seriously though, it's about time someone took a closer look at how they handle reporting tools. I mean, who wants to be misled into reporting something as a violation when really it's just not cool with the platform? It's all about transparency and user experience, you know? 🤔 The DSA is trying to protect people from all this manipulation nonsense. And let's not forget, these companies are making a ton of money off our data... it's only fair they follow some basic guidelines.
 
So I'm thinking... these DSA violations are super concerning, you know? Like, the whole point of having reporting mechanisms is to allow users to report content that's against the terms of service. But if the design is all misleading and unclear, how can we really trust that it's working properly?

I mean, imagine using a platform where you're like "oh, I'm reporting this suspicious post" only to find out later that it was just not following the rules of engagement or whatever. That's basically useless, right? And on top of that, there's the whole issue of users being misled into thinking they're doing the right thing.

It's like, the DSA is all about protecting user rights and ensuring these platforms are responsible for removing problematic content. But if the reporting mechanisms aren't up to par, how can we expect that to happen?

And let's not forget about the fines – six percent of revenue is a huge chunk of change. If companies can't get it together on this stuff, they're gonna be hit hard.

This investigation in Ireland is like a big deal because it's part of their effort to make sure platforms are complying with the DSA and GDPR regulations. And if other platforms follow suit and make changes to their reporting mechanisms, that'll be even better.

But what really gets me is the fact that there's another investigation going on into X (the social media company behind Grok AI) because they're allegedly training their AI on user posts violating GDPR rules... that's some sketchy stuff. 🤔
 
🤔 I'm all for holding these big platforms accountable, but can we make this more accessible to regular users? Like, I get that the DSA is supposed to protect us from bad content, but how are we s'posed to report it if we're just gonna be confused by the design?

And btw, TikTok and LinkedIn have some serious explaining to do 📊. Those "deceptive interface designs" sound super dodgy. It's like they're trying to trick users into reporting legit stuff instead of actual hate speech or whatever.

We need to make sure these platforms are transparent about their reporting mechanisms and that we can trust the algorithms, ya know? Otherwise, it's just a bunch of hoops for us to jump through without actually getting rid of the bad stuff.

What do you guys think? Should we be stricter with these platforms, or is this just another example of them trying to play both sides? 🤷‍♀️
 