this post was submitted on 25 Mar 2026
1167 points (99.1% liked)

Technology

[–] Quexotic 4 points 6 days ago

If corporations are people, then why can't Facebook go to jail?

Money.

I predict this will be tied up in appeals until the day SCOTUS or the executive snuffs these suits out.

[–] Fredselfish@lemmy.world 173 points 1 week ago (9 children)

So... it's a fucking fine, which is way less than they made by doing this. Until we throw these fucks in jail, this shit will continue.

[–] staircase@programming.dev 59 points 1 week ago* (last edited 1 week ago) (2 children)

In the next phase of the legal proceedings, due to begin on 4 May, the attorney general’s office will seek additional financial penalties and court-mandated changes to Meta’s platforms that “offer stronger protections for children”, said Torrez.

The design feature changes the state is seeking include “enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors”.

Unclear how age verification would play out with their Digital Childhood Alliance efforts.

[–] Fredselfish@lemmy.world 62 points 1 week ago

I promise you whatever happens it won't be good for the rest of us.

[–] fluffykittycat@slrpnk.net 19 points 1 week ago (6 children)

And that shit is why I'm hesitant to endorse these big tech lawsuits

[–] Lost_My_Mind@lemmy.world 31 points 1 week ago (4 children)

Until throw these fucks in jail this shit will continue.

Which is exactly why that won't happen. Our president is a pedophile. There's a whole network of wealthy pedophiles who no longer have an island. The pedophiles are in power.

[–] OwOarchist@pawb.social 20 points 1 week ago (2 children)

who no longer have an island

*who now have a different island that we don't yet know about.

[–] randompasta@lemmy.today 16 points 1 week ago (1 children)

Fine based on % income of the company.

[–] deathbird@mander.xyz 84 points 1 week ago (16 children)

The design feature changes the state is seeking include “enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors”.

Oh fuck right off.

I'm sorry but this is a bad "think of the children" decision. There are limits to what Meta or any platform can do about bad actors at that size without structural changes.

What might actually help: only show people content from groups and people that they follow, preferably in chronological order, rather than suggesting new groups and pages algorithmically all the time and thereby increasing the likelihood of children interacting with strangers on the Internet.

And improve parental controls for children's accounts. I'm sure there's nothing currently giving a "parent" account high level control over a "child" account, but I'm happy to be corrected if I'm wrong.

But also: require interoperability with other platforms and a standardized form of profile data export, so people can leave Facebook but stay in touch with the people who still use it.
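The followed-only, chronological feed described above can be sketched in a few lines — a quick illustration, with all names and data made up for the example:

```python
# Sketch of a followed-only, chronological feed: keep only posts from
# accounts the user explicitly follows, newest first, with no
# algorithmically suggested pages or groups mixed in.

def build_feed(posts: list[dict], following: set[str]) -> list[dict]:
    followed = [p for p in posts if p["author"] in following]
    return sorted(followed, key=lambda p: p["timestamp"], reverse=True)

posts = [
    {"author": "friend_a", "timestamp": 100, "text": "hello"},
    {"author": "suggested_page", "timestamp": 300, "text": "engagement bait"},
    {"author": "friend_b", "timestamp": 200, "text": "news"},
]
feed = build_feed(posts, following={"friend_a", "friend_b"})
# The suggested page is filtered out; friend_b's newer post comes first.
```

The point of the sketch is that the change is a filter plus a sort, not a moderation problem: strangers never enter the feed in the first place.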

[–] MyMindIsLikeAnOcean@piefed.world 0 points 6 days ago (1 children)

Dude…installing Facebook Purity doesn’t protect you from child predators, what are you even talking about?

Want to know how all social media could simply and easily protect minors, and everybody at large? Hire some fucking moderators. Every social media company should be required to use as many humans as it takes to moderate all content posted on their platforms…every problem would be reduced to near zero. What’s happening now is that nobody works at META…except at the design, legal and coding level. If you’re a bad actor and you want to post…you use a bot to interact with an automated process, and you’re always one step ahead of the automated process.

[–] deathbird@mander.xyz 1 points 5 days ago (1 children)

This is a solvable problem though. FB could create tools to allow their users to cultivate a better experience, including but not limited to parents and children. It wouldn't require a war of attrition against automation, or infinite moderators. But allowing people deeper control of their experience would reduce the number of ads you could shove in their faces and the amount of profit you could make; they therefore won't do it voluntarily, and that's why they should be compelled by law to provide such functionality.

[–] MyMindIsLikeAnOcean@piefed.world -1 points 4 days ago (1 children)

They should be compelled to…sell fewer ads? Silly. What do you mean by “tools”? There are a gajillion tools that nobody understands or uses…we need more responsibility in the purveyor…not the user. Saying you want tools is the status quo.

Moderation is the only solution. Social media companies should be required, with no exceptions, to follow the laws of the region they operate in. They don’t do that…they put out whatever whenever and take almost no responsibility for what they expose people to.

[–] deathbird@mander.xyz 0 points 4 days ago (1 children)

By "tools" I generally mean software, and the options/functionalities offered by that software through the regular user interface, that enable one to modify the outputs of that software, and thus one's user experience. So in this sense Windows 11 is a "tool" that as an operating system enables one to use a computer, but also therefore supplies tools to modify the experience, such as one that lets a privileged user prevent non-privileged users from uninstalling software or sharing a printer to the LAN, right? Facebook (a software deployment owned and remotely hosted by Meta) has a tool that allows a person with a Javascript-enabled web browser (also a tool) or Meta's proprietary application to send a message to a stranger on the internet, or a known person, along with a lot of other things, right?

Now what Windows 11 doesn't have is a tool that lets me locate my mouse pointer on screen easily, but that's okay because I can install PowerToys to gain that functionality. I can also install software that modifies the Facebook experience to some degree, but there's not a lot of that for various reasons, and certainly I can't find any that sells itself as a child-safety or parental control solution. But that makes sense, right? Because in order to serve that functionality it would have to be deployable across all computers the child uses to access that remote service, and it would have to be updated to match changes in that service's software, like your shadow is attached to your feet. Not practical at all.

Obviously this is of limited use, and this is why people who use tools to modify their experience of social media sites like FB are usually doing so merely for their own comfort and enjoyment, which is valid but not the same purpose as parental control. And the relationship between the remote service and the local software developer is adversarial. This is why there are plenty of parental control tools to block a website, but none to modify one.

I actually agree that moderation is the solution, but not in the way you mean. FB doesn't create content, it just facilitates people sharing their own (bots too, but set that aside). I don't think any sane person believes that Meta or lemmy [dot] world or any other platform could continue to exist if it was held responsible for what its users said. Platforms make what efforts they do at moderation to avoid getting DMCAed, to keep themselves advertiser-friendly, and to make their services sufficiently enjoyable to the users those advertisers want their ads to be seen by. That last bit's important, but even looking at the first two, a legal regulation and "regulation" by market forces in the wild, you can see how these already cause problems. But what platforms like FB don't give you, because they don't want you to have it, is control over your user experience.

FB doesn't want you to have tools (account options) to moderate your own or your child's experience on their platform because it would cost them money, both in development costs and opportunity costs. But that's what's actually needed to make FB an enjoyable and even child-safe experience. Not broad legal "moderation" demands that no platform could survive without obscenely invasive company-side tools and exploitative labor outsourcing, but functional tools (that yes, would have to be mandated by law because they won't do it voluntarily) that enable the user to control their own experience. It's a question of: do you want some underpaid and thrice subcontracted Indian/Nigerian tech workers reading your teen's sexts with his boyfriend and making judgment calls as to their appropriateness, or do you want the capacity to simply allow communication between those two accounts without monitoring them, but retain the ability to block DMs from unknown accounts so your kid doesn't get groomed by a stranger? We're constantly told we have to choose between total system control or the Wild West, but we are only encouraged to consider these possibilities because they're what's cheapest for the companies.
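The "block DMs from unknown accounts" control described here is simple to sketch — a hypothetical per-account policy check, with all names invented for the example:

```python
# Hypothetical DM policy check: messages from senders the account (or a
# parent managing it) has not approved are rejected before delivery,
# without anyone having to read the message contents.

def allow_dm(sender: str, approved_contacts: set[str], dm_policy: str) -> bool:
    """Return True if a DM from `sender` should be delivered.

    dm_policy: "everyone" - accept all senders
               "contacts" - accept only approved contacts (a child-safe default)
    """
    if dm_policy == "everyone":
        return True
    return sender in approved_contacts

# A child account whose parent approved exactly one contact:
contacts = {"boyfriend_account"}
allow_dm("boyfriend_account", contacts, "contacts")  # delivered
allow_dm("total_stranger", contacts, "contacts")     # blocked
```

The design point is the same one the comment makes: an allowlist gates *who* can reach the account, so no human moderator ever needs to inspect *what* is said between approved contacts.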

[–] MyMindIsLikeAnOcean@piefed.world 0 points 1 day ago (1 children)

It just sounds like you’re a lawyer for Facebook. More of the same…more user-end “tools” that nobody uses and get abandoned, more harm to everyone. Down the well we go.

[–] deathbird@mander.xyz 0 points 1 day ago* (last edited 1 day ago) (1 children)

Would have been nice if you'd read what I wrote, but okay.

What Facebook wants is mandatory age checks at the OS level so they can just call an API and avoid all responsibility within their own platform.

What Facebook doesn't want is users being able to control their own experience of the platform.

[–] MyMindIsLikeAnOcean@piefed.world 0 points 20 hours ago* (last edited 20 hours ago) (1 children)

I can’t really say it in any more different ways. One last time.

Yes, of course Facebook wants to push unmoderated addictive content on all their users.

But yes, Facebook also loves putting out endless “user tools” so they can push the responsibility off of themselves, for the same reason. These tools already exist. Tools are absolutely useless when you’re trying to protect at-risk children or people in general…it’s like asking people to be their own doctor.

All social media needs to be regulated at a fundamental level, and that regulation must include each agent being responsible for the content their users post. Putting out more tools so users can block ads or control their kids will make things worse, as the companies continue the arms race for attention. The only people who benefit from tools are helicopter parents and the tech savvy.

[–] deathbird@mander.xyz 1 points 10 hours ago

No, I get what you're saying, but your understanding of the world as it exists is incorrect, and your values favor oppression and anti-freedom.

Your incorrect understanding of reality: the on-platform tools that exist currently on Facebook are useless. You are powerless through account settings to limit your exposure to content from strangers on your feed, much less your child's, except by individually blocking accounts as you see them while logged into the account you want to block them from. Even Bluesky, which also has insufficient tools, is slightly better in this regard. The few on-platform tools you're offered only exist to give you the illusion of control over your experience. Greater control is possible but not offered because it's less profitable. It could be mandated through law.

Your anti-freedom values: making platforms responsible for user content will destroy them or force severe proactive censorship and real identity policies. None of that is conducive to a free and open society. The fediverse could not exist if servers could be held responsible for what users say or do. Most of the Internet couldn't exist if one rogue or politically unpopular user could land the service they use in court by offending another.

Your last paragraph is complete nonsense. The way to win an arms race is to come in with bigger arms. That's where the government comes in, not to force its own will but to restrain companies and empower people. The notion that giving people greater control of their experiences can harm them is insane.

[–] lmmarsano@group.lt 19 points 1 week ago* (last edited 1 week ago) (3 children)

And improve parental controls for children’s accounts. I’m sure there’s nothing currently giving a “parent” account high level control over a “child” account, but I’m happy to be corrected if I’m wrong.

Parental controls already exist in every major OS, they suffice to restrict & monitor social media, and they go unused.

A better solution might be for laws to provide parents resources & incentives to parent children's online activity (including training to use resources they already have) & to provide children education in online safety & literacy. Decades ago, federal courts citing commission findings & studies recommended these alternatives as superior in effectiveness, meeting government duties to minimize impact on civil liberties, allocation of law enforcement resources, etc. For the permanent injunction to COPA, the judge wrote

Moreover, defendant contends that: (1) filters currently exist and, thus, cannot be considered a less restrictive alternative to COPA; and that (2) the private use of filters cannot be deemed a less restrictive alternative to COPA because it is not an alternative which the government can implement. These contentions have been squarely rejected by the Supreme Court in ruling upon the efficacy of the 1999 preliminary injunction by this court. The Supreme Court wrote:

Congress undoubtedly may act to encourage the use of filters. We have held that Congress can give strong incentives to schools and libraries to use them. It could also take steps to promote their development by industry, and their use by parents. It is incorrect, for that reason, to say that filters are part of the current regulatory status quo. The need for parental cooperation does not automatically disqualify a proposed less restrictive alternative. In enacting COPA, Congress said its goal was to prevent the “widespread availability of the Internet” from providing “opportunities for minors to access materials through the World Wide Web in a manner that can frustrate parental supervision or control.” COPA presumes that parents lack the ability, not the will, to monitor what their children see. By enacting programs to promote use of filtering software, Congress could give parents that ability without subjecting protected speech to severe penalties.

I also agree and conclude that in conjunction with the private use of filters, the government may promote and support their use by, for example, providing further education and training programs to parents and caregivers, giving incentives or mandates to ISP’s to provide filters to their subscribers, directing the developers of computer operating systems to provide filters and parental controls as a part of their products (Microsoft’s new operating system, Vista, now provides such features, see Finding of Fact 91), subsidizing the purchase of filters for those who cannot afford them, and by performing further studies and recommendations regarding filters.

Adult supervision, child education on online safety & literacy, parental controls & filters are more effective at less expense to fundamental rights. Governments know this & conveniently forget it.

[–] bitjunkie@lemmy.world 76 points 1 week ago (5 children)

Putting this in fixed-width for scale:

```
This ruling:                        375,000,000
Meta valuation:               1,618,000,000,000
```

This isn't even a slap on the wrist; it's a fucking rounding error.
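Using the two figures quoted in this comment, the "rounding error" claim is easy to check:

```python
# Fine vs. company valuation, figures as quoted in the comment above.
fine = 375_000_000
valuation = 1_618_000_000_000

pct = fine / valuation * 100
print(f"{pct:.4f}%")  # about 0.0232% of the company's valuation
```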

[–] 7101334@lemmy.world 54 points 1 week ago

Phrased another way, it's as if you had $1,618 in the bank and were fined about $0.38.

[–] rumba@lemmy.zip 16 points 1 week ago

Super small compared to their income, but a GREAT reason to make all the users age validate.

[–] ExLisper@lemmy.curiana.net 63 points 1 week ago (2 children)

The jury ordered Meta to pay the maximum penalty under the law of $5,000 per violation, totaling $375m in civil penalties for violating New Mexico’s consumer protection laws.

Meta: I guess I will only be able to spend $79,635,000,000 on my next useless venture.
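The quoted penalty structure implies a specific violation count — the total divided by the statutory per-violation maximum:

```python
# Figures from the quoted ruling: $5,000 statutory maximum per violation,
# $375m total in civil penalties.
total_penalty = 375_000_000
per_violation = 5_000

violations = total_penalty // per_violation
print(violations)  # 75,000 violations at the $5,000 maximum
```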

[–] Sunflier@lemmy.world 28 points 1 week ago (2 children)

I'ma bet that they spend 10 million of that 79 billion on bribes to change the law so this never happens to them again.

[–] Boiglenoight@lemmy.world 53 points 1 week ago (4 children)

Facebook made 200 billion in revenue in 2025.

https://stockanalysis.com/stocks/meta/revenue/

They were fined $375 million. They averaged $550 million per day last year.

If social media companies were required to moderate their content…if they were responsible for what’s posted…all problems would go away.

As it stands bad actors use bots to stay one step ahead of automated moderation.
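The revenue comparison in this comment checks out, using the figures quoted above (rounded):

```python
# ~$200bn annual revenue (per the linked page) vs. the $375m fine.
annual_revenue = 200_000_000_000
fine = 375_000_000

daily_revenue = annual_revenue / 365
print(round(daily_revenue))       # roughly $548 million per day
print(fine / daily_revenue)      # the fine is about 0.68 of one day's revenue
```

In other words, the penalty amounts to well under a single day of Meta's revenue.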

[–] melsaskca@lemmy.ca 45 points 1 week ago (4 children)

Good! Remember though, fines don't count anymore, only hard time. Remove some years from these fuckers' lives and they'll think twice in the future.

[–] Tenderizer@aussie.zone 3 points 6 days ago

This lawsuit is about end-to-end encryption and the lack of age verification on Instagram. So not good.

[–] Goodlucksil@lemmy.dbzer0.com 19 points 1 week ago (4 children)

Do I have to remind everyone the ending of The Wolf of Wall Street?

Spoiler: Rich people go to rich-people prisons that aren't really prisons and are better than your house.

[–] phoenixz@lemmy.ca 35 points 1 week ago (10 children)

jury finds firm misled consumers over safety and enabled harm against users

If I do something like this, I go to jail

WHY THE FUCK IS ZUCKERBERG NOT IN JAIL?

[–] impdroid@lemmy.world 4 points 6 days ago

Billionaires bought their get-out-of-jail-free cards decades ago

[–] Shanmugha@lemmy.world 26 points 1 week ago (1 children)

Oh no, "child protection" was never about protecting children? I am shocked, shocked

[–] XLE@piefed.social 23 points 1 week ago (2 children)

Unfortunately, part of the court's decision was that Facebook wasn't surveilling people enough.

The New Mexico court heard how Meta’s 2023 decision to encrypt Facebook Messenger – its direct messaging platform, which predators have used as a tool to groom minors and exchange child abuse imagery – blocked access to crucial evidence of these crimes.

[–] BarneyPiccolo@lemmy.today 23 points 1 week ago* (last edited 1 week ago) (7 children)

Fine Zuckerfuck his entire net worth AND Meta. He's poor now.

Now, let's take a look at Musk, Bezos, and Ellison.

[–] TwilitSky@lemmy.world 21 points 1 week ago

You mean we shouldn't have put our children's safety in the waxy grasp of a sentient Annabelle in a t-shirt and jeans?

[–] BlackCat@piefed.social 20 points 1 week ago

Meta has generated high volumes of “junk” reports by overly relying on AI to moderate its platforms, investigators said. These reports were useless to law enforcement, and meant crimes could not be investigated, they said.

shocker.
