this post was submitted on 26 Mar 2026
109 points (81.1% liked)

[–] KoboldCoterie@pawb.social 148 points 5 days ago (9 children)

> Here’s a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?

> Of course not. Because infinite scroll is not inherently harmful. Autoplay is not inherently harmful. Algorithmic recommendations are not inherently harmful. These features only matter because of the content they deliver. The “addictive design” does nothing without the underlying user-generated content that makes people want to keep scrolling.

This feels like an awful argument to make. It's not the presence of those things that makes Meta and co so shit; it's the fact that they provably understood the risks and the effects their design was having, knew that it was harming people, and continued to do it anyway. I don't care if we're talking about a little forum run by a Grandma and Grandpa talking about their jam recipes; if they know that they're causing harm and don't change their behavior, they should be liable.

[–] HeartyOfGlass@piefed.social 48 points 5 days ago (3 children)

"We designed, marketed, and sold the gun, but we didn't think anyone would use it."

[–] KoboldCoterie@pawb.social 21 points 5 days ago

It's like if someone had a forum where insurrectionists were discussing how to build bombs and where they were going to use them, and the owners had an internal meeting where they said, "Hey, we're hosting some pretty awful people, should we maybe report them or shut this down?" and the answer was, "Nah, they're paying users, and we want their money."

Pretty sure Section 230 wouldn't protect them, either.

[–] fartographer@lemmy.world 1 points 3 days ago* (last edited 3 days ago)

Now now now, ladies and gentlemen, I'm just a simple country lawyer, and I sure love me some mashed potatoes. I love mashed potatoes; I eat them every day. I love mashed potatoes so much that, hell, I'll have them with anything. I also love my gun, but I wouldn't eat my gun! *Hold for laughter* Now what if I had mashed potatoes with my gun? Not like... *picks up revolver from displayed evidence and pantomimes using it as a fork, putting the barrel all up in his mouth. Jury roars with laughter.* No. Imagine that I'm stuffing my mashed potatoes into this gun! There's mashed potatoes in the barrel, mashed potatoes in the chambers, mashed potatoes gunking up the cylinder and hammer... Do you think this gun will fire? Of course not! I could point my mashed potato gun at anyone in this court *muzzle sweeps the jury*, and no one would even flinch. How could something that can be defeated by MASHED POTATOES be dangerous? Hell, how could a person holding such an impotent device have any sense of danger? Have you ever killed anybody with mashed potatoes? Have YOU??

We all know opposing counsel argues that my client "intentionally shot, at point blank range," his own best friend. A best friend is someone you eat mashed potatoes with! Not someone you murder and then "steal" their suspiciously unopened Star Wars memorabilia from... This is why you need to return a verdict of "not guilty" and award my client $50 million from the so-called "victim's" family for psychological and emotional damages, as well as the cost of selflessly grinding up and eating his best friend's body to save the family funeral costs. The defense rests.

[–] XLE@piefed.social 17 points 5 days ago

It's like he's describing a slot machine with unpainted wheels, leaving out the context that it's in a casino with a big "paint me and enjoy a share of the profit" sign above it.

The social media machine was designed to be a self-serve addiction generator. It intentionally used every trick it could legally get away with.

[–] atrielienz@lemmy.world 7 points 5 days ago

The harm doesn't come from infinite scroll, autoplay, or algorithmic recommendations in a vacuum.

But it has been shown statistically that when you gamify the system and the content is harmful to consume in excess, those two factors together are what make it dangerous.

Tricking the brain into doing something harmful to itself by gamification is the problem. The algorithm, auto play and infinite scroll are just mechanisms to facilitate that. Novelty only plays a small part in that. The algorithm by itself doesn't provide a dopamine hit. The infinite scroll by itself doesn't provide a dopamine hit. The auto play feature by itself doesn't cause a dopamine hit.

Even when you combine all three, the dopamine hit won't come if the content being pushed isn't sufficient to cause a rush of dopamine. And that rush often comes from things like upvotes and downvotes, badges, achievements, and follower counts. These metrics, which individual users rely on for dopamine, are being weaponized against them to make money. And that was intentional on the part of Meta execs.

[–] avidamoeba@lemmy.ca 8 points 5 days ago

Also they can now generate content without users, which they already do a lot on Facebook.

[–] freeman@sh.itjust.works 12 points 4 days ago (1 children)

One of the key pieces of evidence the New Mexico attorney general used against Meta was the company’s 2023 decision to add end-to-end encryption to Facebook Messenger. The argument went like this: predators used Messenger to groom minors and exchange child sexual abuse material. By encrypting those messages, Meta made it harder for law enforcement to access evidence of those crimes. Therefore, the encryption was a design choice that enabled harm.

The state is now seeking court-mandated changes including “protecting minors from encrypted communications that shield bad actors.”

I don't see any of the people celebrating this decision discussing this. Perhaps it's a misrepresentation by the author, since I can't find the actual decision text.

This is going to harm small non-corporate websites, not just social media, far more than Facebook or TikTok. "Harmful content" is also going to include things like LGBTQ content, especially anything trans-related, and 'antisemitism' (but probably not actual antisemitism).

[–] zerofk@lemmy.zip 55 points 5 days ago* (last edited 5 days ago) (2 children)

> This distinction — between “design” and “content” — sounds reasonable for about three seconds. Then you realize it falls apart completely.

Bull fucking shit. This is not about platforms being held responsible for user content. This is about adding points and badges and achievements and all kinds of things designed to reward engagement with dopamine.

The author's example, where all the content is paint drying, would absolutely be addictive if the platform added an achievement for watching 10 different colours. Or: Congratulations, you've watched paint dry for 100 hours! As a reward, you get a fancy new emote! THAT is what these platforms do, that is what is addictive, and that is what they've been held liable for.

It's not a loophole to get around Section 230, as the author claims.
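To make the mechanic concrete, the reward loop described above can be sketched in a few lines of Python. This is a toy illustration only; every function name, milestone, and badge string here is hypothetical, invented for this comment, not any platform's actual code:

```python
# Toy sketch of a time-spent reward loop; all names, milestones, and
# badge texts below are hypothetical, invented for illustration only.
WATCH_MILESTONES_HOURS = [1, 10, 100]


def achievements_unlocked(hours_watched: float, distinct_colours: int) -> list[str]:
    """Return badges earned purely for time spent and variety consumed.

    Nothing here depends on what the content *is* -- the reward is
    attached to the act of watching itself.
    """
    unlocked = []
    for milestone in WATCH_MILESTONES_HOURS:
        if hours_watched >= milestone:
            unlocked.append(f"{milestone}-hour watcher badge")
    if distinct_colours >= 10:
        unlocked.append("Connoisseur: watched 10 different colours!")
    return unlocked


# A user with 100 hours across 10 colours earns every badge, even
# though the "content" is still just paint drying.
print(achievements_unlocked(100, 10))
```

The point of the sketch: swap the paint videos for any other content and the reward schedule is unchanged, which is exactly why the design, not just the content, is at issue.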

[–] Baylahoo@sh.itjust.works 3 points 4 days ago (1 children)

I'm not disagreeing with you when I say this; I'm just not on social media other than Lemmy and YouTube at this point, so I'm out of the loop. What are these sites doing that gamifies watching content? I get all the other stuff for posting content, like likes and views; that incentivises content producers. But how are viewers getting the equivalent of "likes and views" on Facebook?

[–] SaltySalamander@fedia.io 1 points 4 days ago

> What are these sites doing that gamifies watching content?

adding points and badges and achievements and all kinds of things designed to reward engagement with dopamine

[–] nickiwest@lemmy.world 5 points 5 days ago

Let's not forget the years of literal psychological experiments that Meta conducted on its users to find out exactly what factors led to higher engagement.

This isn't a simple message board. This is a highly-engineered, personalized content delivery system with the goal of serving as many ads as possible.

[–] sundray@lemmus.org 50 points 5 days ago (1 children)

Surprise surprise. If you go through Techdirt's archives, you can see Mike Masnick has spent thousands of words losing his shit any and every time Facebook has faced ANY criticism. I don't know if he has a financial interest in them (like he does with Bluesky) but the moment someone suggests reining them in, here comes Masnick to defend one of the richest, most lawyer-ed up companies around.

[–] XLE@piefed.social 23 points 5 days ago (1 children)

Mike Masnick is on the Bluesky board of directors. Could this position be affecting his judgment on this specifically? Because usually I expect Techdirt and Mike himself to be much more reasonable.

[–] General_Effort@lemmy.world 9 points 5 days ago (1 children)

Yes, of course. Bluesky is also social media and so the precedent set by these cases will apply to it. Besides, knowledge of a subject does tend to affect your judgment.

[–] SaltySalamander@fedia.io 2 points 5 days ago (2 children)

> Bluesky is also social media

So is Lemmy...

[–] NarrativeBear@lemmy.world 2 points 5 days ago (2 children)

I was wondering the other day if Lemmy or Bluesky have any algorithms that are actively trying to keep users engaged?

[–] jackal@feddit.uk 2 points 4 days ago

Technically it does: Lemmy uses ranking algorithms, but they're for sorting and surfacing content rather than a modern "engagement optimization" system like a recommendation feed designed to maximize time spent.
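For the curious, that kind of sorting can be sketched like this. The shape (a log-dampened score divided by a power of the post's age) resembles Lemmy's published "Hot" rank, but the constants below are illustrative assumptions, not the project's source:

```python
import math
from datetime import datetime, timedelta, timezone


def hot_rank(score: int, published: datetime, now: datetime) -> float:
    """Score-with-time-decay ranking in the style of Lemmy's 'Hot' sort.

    Every user gets the same number for the same post: nothing in the
    formula knows who is looking, so nothing is personalized.
    """
    hours_old = (now - published).total_seconds() / 3600
    # log dampens runaway vote counts; the +3 keeps the argument >= 1
    vote_weight = math.log(max(1, 3 + score))
    # the age exponent (assumed 1.8 here) makes rank decay steadily
    return vote_weight / ((hours_old + 2) ** 1.8)


now = datetime.now(timezone.utc)
fresh_modest = hot_rank(10, now - timedelta(hours=1), now)
stale_viral = hot_rank(1000, now - timedelta(hours=48), now)
# A fresh, modestly upvoted post outranks a two-day-old viral one --
# the front page keeps turning over without tracking individual users.
print(fresh_modest > stale_viral)
```

The design choice worth noticing is that time decay alone refreshes the feed, so there's no need for per-user engagement modeling.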

[–] zalgotext@sh.itjust.works 6 points 5 days ago (1 children)

Cool thing about Lemmy is you can just read the code and find out

[–] JayGray91@piefed.social 1 points 4 days ago (1 children)

IIRC somewhere they also explained it in plain English what the sorting methods do. My layman brain thinks that's a kind of algorithm.

Kindly correct this layman if I'm misunderstanding :)

[–] Baylahoo@sh.itjust.works 1 points 4 days ago

I'm also a layman, but I have read some discussions about this exact comparison. Essentially, the big mainstream sites often have personalized algorithms that learn and adapt specifically to each user, feeding them whatever junk-food content keeps them engaged. Algorithms on things like base Lemmy, or maybe Reddit in the past, just have a sort function, like Excel's, that props up posts with more likes or more comments. You can see what other people are interested in, but it's not targeting YOU.

The predatory targeting algorithm can put a person into a self-fulfilling echo chamber that in some ways resembles psychosis, and this could naturally evolve into actual psychosis for some individuals. The old verbiage of "touch grass" was the prescription for fighting the effects, but it's a lot harder to "touch grass" when people are increasingly online and have fewer and fewer avenues to get out of their own echo chamber while staying online almost exclusively. I'm not an expert, and the people I got this info from have no credentials I can source, but the logic seems sound to me. Anyone with better credentials should weigh in if I'm wrong.

The Internet went from globalizing us to partitioning us pretty suddenly, and I think we are seeing the effects now.
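The contrast described above, a shared "Excel-style" sort versus a feed that adapts to you, can be sketched in a dozen lines. The post fields, topics, and weighting below are all invented for illustration:

```python
from collections import Counter


def global_rank(posts: list[dict]) -> list[dict]:
    # Non-personalized: everyone sees the same order, highest score first.
    return sorted(posts, key=lambda p: p["score"], reverse=True)


def personalized_rank(posts: list[dict], history: Counter) -> list[dict]:
    # Personalized: each post's score is boosted by how often this user
    # engaged with its topic before -- past engagement feeds future
    # ranking, which is the echo-chamber loop described above.
    return sorted(
        posts,
        key=lambda p: p["score"] * (1 + history[p["topic"]]),
        reverse=True,
    )


posts = [
    {"title": "big news", "topic": "news", "score": 90},
    {"title": "outrage bait", "topic": "outrage", "score": 40},
]
habitual_user = Counter({"outrage": 5})  # clicked outrage posts 5 times

print(global_rank(posts)[0]["title"])                        # shared front page
print(personalized_rank(posts, habitual_user)[0]["title"])   # their private feed
```

The lower-scored post wins the personalized feed purely because the user has clicked that topic before, and every additional click widens the gap.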

[–] General_Effort@lemmy.world 1 points 5 days ago

Yes, but everyone on Lemmy knows that the law only applies to the bad guys.

[–] dhork@lemmy.world 25 points 5 days ago

Normally, I am all for Techdirt's takes. But I think this one is off the mark a bit, because I legitimately think that infinite scroll and auto play are insidious, and actually harmful enough to be treated as a dangerous design decision.

The whole point of Section 230 is that communications companies can't be held responsible for harmful things that people transmit on their networks, because it's the people transmitting those harmful things who are actually at fault. And that was reasonable in the initial stages of the Internet, when people posted on bulletin boards (or even early social media) and harmful content had a much smaller reach. People had to "opt in", essentially, to be exposed to this content, and if they stumbled on something they found objectionable, they could easily change their focus.

But the purpose of infinite scroll and autoplay is to get people hooked on content. The algorithms exist to maximize engagement, regardless of the value of that engagement. I think the comparison to cigarettes is particularly apt: they are looking to hook people into actively harmful behaviors, for profit. And the algorithms don't really differentiate between good engagement and harmful engagement; anything that attracts the user's attention is fair game.

The author's points regarding how these rulings can be abused are correct, but that doesn't negate how fundamentally harmful these addictive practices are. It will be up to lawmakers to make sure that the laws are drafted in such a way that they can be applied equitably.... (So maybe we're screwed after all....)

[–] knobbysideup@sh.itjust.works 14 points 5 days ago (15 children)

"For the children" tech laws should all be abolished. Why should I be burdened because you can't be bothered to raise your own damned kids properly?

[–] teyrnon@sh.itjust.works 13 points 5 days ago

In truth, this is part and parcel of age controls as an excuse to ID everyone.

[–] DFX4509B@lemmy.wtf 10 points 5 days ago (1 children)

This is probably an extreme take, but kids shouldn't be anywhere near a tablet, especially while they're still really young.

[–] can_you_change_your_username@fedia.io 1 points 5 days ago (1 children)

It's kind of a tough balance. Yes, unrestricted tech use is an issue for young children, but on the other hand, using tech while young is the best way to make it a natural part of your experience of the world, and tech isn't going away. If you go to the other extreme and say no tech whatsoever before a certain age, are you setting the kid back against more tech-literate peers? There's also the consideration that's been discussed around alcohol forever: by making it an "adult thing" and effectively a rite of passage, do you cause more problems and abuse in young adults than if it had always been a part of their experience and the focus was on responsible use instead of total abstinence?

[–] Baylahoo@sh.itjust.works 1 points 4 days ago

I completely agree. It would be amazing if we could nationally or even globally enforce age restrictions to give children an internet kiddie pool where they could learn and grow in a safe online environment. But we live in a time where the people pushing this in government should not be trusted to use that information for the stated reason. "For the kids" is all made up and isn't helping kids; demanding we give up privacy in a way that doesn't actually help kids really highlights how corrupt the people pushing for this are.

[–] Corkyskog@sh.itjust.works 4 points 5 days ago* (last edited 5 days ago)

The author reads like he doesn't understand context or the legal idea of a rational actor. What users are going to purposefully upload boring content?

> Here’s a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?

> Of course not. Because infinite scroll is not inherently harmful. Autoplay is not inherently harmful. Algorithmic recommendations are not inherently harmful. These features only matter because of the content they deliver. The “addictive design” does nothing without the underlying user-generated content that makes people want to keep scrolling.

[–] TemplaerDude@sh.itjust.works 2 points 5 days ago

This guy has an addiction, lol. Ironic.

[–] Lumisal@lemmy.world -3 points 4 days ago* (last edited 4 days ago)

Local wannabe crack dealer Mike Masnick says crack isn't harmful, life without it would be boring. More at 11