The company owned by Elon Musk – X (previously Twitter) – has found itself in a rather awkward position. Imagine waking up one day and realizing that information you assumed was securely stored was, in fact, being fed into an AI model without your consent. It sounds like the plot of a science fiction movie, doesn't it? Well, this is real life, and it is happening to millions of users across Europe.
What Occurred?
Recently, the European Center for Digital Rights, better known as Noyb (“none of your business”), filed complaints against the company. Noyb claims that X misused the personal data of over 60 million European users to train its Grok AI model. And here’s the kicker: the users had no idea this was happening. No prior warning, no “Excuse me sir/ma’am, do you mind if we use your data for a few hours?” None of that. No one knew about it until a viral post appeared on social media in July.
The Law Isn’t Amused
Now, you may be asking yourself, “Can they do that?” Under Europe’s laws, the answer is a resounding no, thanks to the General Data Protection Regulation (GDPR). The law is clear: if a company wants to use your personal information, it has to ask for your permission first.
So when Noyb discovered what X was doing, they were not particularly happy about it. They have been vocal about it, too, filing complaints in eight countries: Austria, Belgium, France, Greece, Ireland, Italy, the Netherlands and Spain. Yes, it is not just one country that is angry, it is a whole bunch of them!
DPC’s Response
After these complaints came to light, Ireland’s Data Protection Commission (DPC) did take some action. It persuaded X to temporarily stop processing user data for AI training, which is progress. However, according to Noyb’s founder, Max Schrems, this amounts to little more than a slap on the wrist. He is calling for the most thorough investigation possible in order to get to the root of the issue.
Schrems has a point. If a company can just say “Oops, sorry!” after violating the rules, what stops it from doing the same thing again? This is not the first time Noyb has taken on a tech giant, either. They have already clashed with Meta, a confrontation that forced Meta to rein in its AI ambitions. Schrems and his team know their way around a legal battle.
What’s Next?
As the drama unfolds, one thing is clear: this is far from over. Noyb’s case is still pending, and the outcome could shape how organizations handle user data in the future. Should X be found to have violated the law, it could face hefty fines and be forced to change how it operates.
For now, users in Europe (and beyond) are left wondering what else might be happening behind the scenes with their personal data. This serves as a wake-up call for all of us to stay watchful about how our data is being handled. Privacy is a valuable asset, especially in this era of artificial intelligence and enormous databases.