Common Noyb W
https://en.wikipedia.org/wiki/NOYB in case anyone else was wondering.
I’d never heard of them before but it looks like they have a pretty good history of holding corps accountable.
Under Article 80, the GDPR *foresees* that non-profit organizations can take action or represent users.
Emphasis mine. Just in case anyone is still denying that Euro-English is a thing. It provides for it, which implies it anticipates it, and if we strip away the unnecessary Romance, it foresees it. "Plans for", as Wikipedia puts it. And it's not only the French etymology (prévoir) that works: there's also German vorhersehen/vorsehen, which collapse into "foresee" when you calque them into English. Gives Brits all kinds of headaches; they were writing lots of huffy memos before Brexit.
I mean, there’s Scottish English (inb4 someone tells me about Glaswegian being different than $idk), American English etc, so Euro-English might be a tad broad but I’m sure it exists.
And then there’s German English which a lot of my older coworkers seem to excel at… “Ja, also, as you see, ve have here not a problem anymore, ja, and ze dropped calls are removed from this node, okeh?” (real quote from a regional manager)
Svengelska is likewise a thing here in Sweden
Max Schrems and his team have done a lot of good regarding user rights in the face of giants like Meta. I’m pretty sure that at least a handful of Meta employees regularly have nightmares because of NOYB.
They are responsible for getting both US “data protection adequacy agreements” (Safe Harbor and Privacy Shield) thrown out in court (see Max Schrems).
Unsure what NOYB is, even after skimming this, but an interesting bit in there about how people wouldn’t have the right to be forgotten once the AI has been trained.
a quick overview of the legal battles of Max Schrems / NOYB: https://en.wikipedia.org/wiki/Max_Schrems
I think there’s some “reasonable” qualifier in the right to be forgotten. Like, if you have old backups on tapes that you must keep for whatever reason for a few more years, you can decline to alter them if the cost would be exorbitant and you ensure the user’s data won’t come back after a recovery from said backup.
Also, they might train their models on a pseudo-anonymized dataset, so as long as it’s too expensive to de-anonymize the user data, it could be fine in terms of GDPR.
For example: you generate car trip stats per city in a country, per day. You could argue that you don’t need to delete user data that is part of this set as long as you ensure there are always enough trips recorded (so nobody can be de-anonymized from a single entry), and deleting it would also falsify your historical stats.
At my company, which likes to be super compliant, we do remove people from this kind of stats using pseudo-anonymous references. So if you remove your account, an event goes out that changes the historical analytics data and removes all traces of your activity. But that’s because we can and want to be cool (company culture principles).
Other data we have (website analytics) can’t go through this process, because we ensure we never know WHO did something. We only know what and when.
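The pseudo-anonymous-reference approach described above can be sketched roughly like this. All names here (`TripStats`, `record_trip`, `forget_account`) are hypothetical, not any real company's code: the only link between an account and its events is a random token in a separate table, so an erasure request can subtract the user's trips from the historical aggregates and then drop the token, leaving no trace.

```python
import secrets
from collections import defaultdict

class TripStats:
    """Hypothetical sketch of aggregate stats keyed by pseudonymous references."""

    def __init__(self):
        # account_id -> random token; this table is the ONLY link to identity
        self._pseudonyms = {}
        # token -> list of (city, day) trips, so aggregates can be corrected later
        self._trips = defaultdict(list)
        # (city, day) -> trip count: the published aggregate
        self._daily_counts = defaultdict(int)

    def _token_for(self, account_id):
        if account_id not in self._pseudonyms:
            self._pseudonyms[account_id] = secrets.token_hex(16)
        return self._pseudonyms[account_id]

    def record_trip(self, account_id, city, day):
        token = self._token_for(account_id)
        self._trips[token].append((city, day))
        self._daily_counts[(city, day)] += 1

    def forget_account(self, account_id):
        """Erasure request: subtract the user's trips from the historical
        aggregates, then drop the pseudonym so no link to them remains."""
        token = self._pseudonyms.pop(account_id, None)
        if token is None:
            return
        for city, day in self._trips.pop(token, []):
            self._daily_counts[(city, day)] -= 1

stats = TripStats()
stats.record_trip("alice", "Stockholm", "2024-05-01")
stats.record_trip("bob", "Stockholm", "2024-05-01")
stats.forget_account("alice")
print(stats._daily_counts[("Stockholm", "2024-05-01")])  # 1
```

The trade-off is that keeping per-token trip lists makes correction possible but also makes re-identification cheaper, which is exactly why the "enough trips per bucket" condition above matters.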
I think there’s some “reasonable” qualifier in the right to be forgotten.
The original case (Google Spain v. Costeja, 2014) was a Spaniard haunted by the first Google result for his name: a decades-old notice in a local newspaper about his property being auctioned off over old debts. No scandal or such, just ordinary debt proceedings, but he could demonstrate that it was impacting his current business.
He sued, and Google had to delist the result, but only for searches on his name. Not for other queries about the proceedings themselves, and the newspaper also didn’t have to do anything. As far as I know you can still find the article.
If you’re a journalist writing the guy’s biography, you’ll find it, push comes to shove in some offline archive. But random people won’t see him nailed to a virtual pillory; that’s what all this is about.
I don’t think it’s really an issue for AI, but it has to be engineered in. Ultimately it’s about judging relevancy.
Does anyone know if there is a NOYB equivalent in North America?
The EFF maybe, but the US’s lack of a GDPR equivalent makes it harder.
California has some strong privacy protections (CCPA/CPRA) and a good chunk of the country’s population. IANAL, but if I were to hope for a similar lawsuit, it would come from a California state court.
Europe’s gonna cut itself off from AI and miss this tech boom. At least they still have internal combustion cars, until China eats their lunch.
They break their own procurement laws to pick MSFT as well; they don’t even abide by their own bureaucracy.
We don’t want US shit. Simple as.
That’s a retarded position to take.
No thanks, the US can keep their spam machines.
thanks for calling out this stupidity.
it’s so hard to find legitimate trolls to block.
Not all AI is equal. Europe does embrace certain types of AI depending on their production and usage. I work at a company pushing our AI throughout Europe, and the reception is generally very positive.
These LLMs are just shit built in shitty ways. Their problem definition is shit, and the marketing of what they can do effectively is bullshit. There are some LLM efforts that are less shitty, but they’re not very popular yet.
Meta’s recent Llama models are a disaster, and worse, they only masquerade as open models. Meanwhile Europe has its own AI research labs like Mistral, which makes really good models under the Apache 2.0 license.
How many months ago was it again that they said they’d need multiple times the entire internet’s worth of data just to make some progress? Sorry, but it’s a bubble.
Edit: by AI I mean LLMs, like @TheBeege said.
Surely there is a middle ground between bending over for the technogarchy and having a somewhat less wealthy economy?