this post was submitted on 31 Mar 2026
368 points (99.7% liked)

Technology

[–] Encephalotrocity@feddit.online 15 points 21 hours ago (3 children)

If it were the law, then the AI itself would be coded to not allow going "undercover", and there would be legal consequences if caught. Torvalds's stance only matters for how things 'are', not how they 'could be'.

Would it be a cure-all? Of course not. Fraud still happens despite the illegality. But it's better than not being able to trust anything ever again.

[–] cecilkorik@piefed.ca 10 points 17 hours ago (2 children)

I hate to break it to you, but we're never going to be able to trust anything ever again. At least, not the way we used to. In the future, without any doubt, we are going to need to develop a different model of learning, using, and processing information, one that considers provenance from essentially first principles: where the information came from and how it got there. We will have to build a web of investigation and trust to determine and mark what information is trustworthy and what is not, especially new information.

None of this exists in any meaningful way yet, and the systems we used to have for it, like academic research and journalism, would have been catastrophically inadequate to handle this onslaught even at their peak. And they are nowhere near their peak anymore, having been deliberately eroded into a shadow of their former effectiveness so some assholes could get rich and powerful.

So hopefully we'll be able to rely on solid ground like Wikipedia and... books as a starting point, and nobody gets around to burning down the Library of Alexandria in their rage against "woke stuff", because otherwise we're going to be rebuilding our information spaces pretty much from scratch in the near future, probably at the same time we're rebuilding civilized society in general. If this sounds incredibly uncertain, tedious, and painful: yes, it will be, especially at first. But we will get better at it eventually. We will develop new systems for it, we will become fluent in information again, and the friction will fade.

I wish we could get to that stage right away, but unfortunately it will have to wait. We can't do anything to improve the swimming pool while we are drowning in it. This is the reality that rampant and unchecked use of AI technologies by soulless corporations and corrupt governments has wrought. Logic and reason never stood a chance, and we are entering the digital dark ages. The enlightenment is probably coming someday, but don't hold your breath for it.

Support your local library, that's the most helpful thing I can think of for individuals to do. Librarians know their shit.

[–] bss03 8 points 15 hours ago (1 children)

we are going to need to develop a different model of learning, using, and processing information that considers the provenance of where the information came from and how it got there

They used to teach this in schools under "critical thinking skills". Following the chain of sources back to the primary sources was a task I had to do (at least in part) more than once in secondary school.

Authoritarians don't like that tho.

[–] cecilkorik@piefed.ca 1 points 3 hours ago

Absolutely. Just as addiction to fast food causes obesity, our addiction to fast information has produced a profound societal ignorance. Studying issues seriously takes time and effort, and if you think "ain't nobody got time for that", I'll tell you right now you're going to have to start making time for it. Because if you don't, you'll end up knowing nothing and being wrong about everything. That may be acceptable to anyone following all the other lemmings in the same direction (the double irony of "lemming behavior" being a historical fabrication itself, while posting this on Lemmy, is not lost on me), but I'm also going to suggest there will be serious personal consequences from being wrong all the time, and those consequences are going to catch up with you sooner or later.

[–] njordomir@lemmy.world 2 points 14 hours ago

I agree. I've thought a lot about how valuable signing a simple message with a key can be. In an age where machines can appropriate your likeness, how do you accumulate and shed reputation? How do you prove it was you? One low-tech version was taking a photo with a newspaper to prove you are a real person. Another is exchanging a public key with a person in real life so you can have reasonable certainty that communications signed with that key are legit. Since this boils down to denying what our own eyes have seen, governments and businesses who are very keen on controlling reality are making their plays. Even identifying yourself cryptographically is only a temporary fix to maintain an existing identity. Your kids will be profiled and mimicked from day one. This whole slippery slope we've been sliding down lately seems very foreseen. It feels like these traps were engineered a very long time ago.
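For anyone curious what "signing a simple message with a key" looks like in practice, here's a minimal sketch using Ed25519 via the third-party Python `cryptography` package. The message text and key handling are purely illustrative; a real setup would keep the private key offline and distribute only the public key (e.g. exchanged in person, as described above).

```python
# Minimal sketch: prove a message came from you by signing it with a
# private key; anyone with your public key can verify the signature.
# Requires the `cryptography` package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Generated once and kept secret; the public half is what you share.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"This statement really is from me."
signature = private_key.sign(message)

# Verification succeeds only for the exact message and matching key.
try:
    public_key.verify(signature, message)
    print("signature valid")
except InvalidSignature:
    print("signature invalid")

# Any tampering with the message makes verification fail.
try:
    public_key.verify(signature, b"This statement was altered.")
    print("tampered message accepted")
except InvalidSignature:
    print("tampered message rejected")
```

Note that this only proves the message was signed by whoever holds the private key; binding that key to a real person is the hard social part, which is exactly why exchanging keys face-to-face matters.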

I think education is the absolute most important thing for a functioning post-truth society. Kids need to smell shit from 20 miles away, because the world is full of traps for your mind, same as it is for your wallet and your physical body. We also need to be able to verify and trust our tech stack. We need to pass down the stories of the times common people lost and the times common people won. We need to read and discuss philosophy. We'll also have to tackle American religion head-on, along with excessively addictive entertainment design. We are a deeply flawed society, and I'm not sure where we should start, except for taking some of our time back so people actually have the opportunity to think about these things.

[–] DireTech@sh.itjust.works 2 points 19 hours ago

Can they even code them to do that? They've struggled so much with the em-dash and never managed to block Disney's characters, so I figure they can't do it 100% of the time even if they want to.

[–] MangoCats@feddit.it -5 points 18 hours ago

and there would be legal consequences if caught.

Like for driving over the speed limit? Or putting glass in the regular trash instead of the recycling? Yeah, just what I need in my life, another arbitrary law that's enforced 0.0001% of the time as a flex by the people in power to target and abuse people they don't like.