this post was submitted on 15 Oct 2025
1251 points (99.0% liked)

Microblog Memes

9582 readers
1470 users here now

A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerrilla marketing.
  4. Posters are encouraged to link to the toot or tweet etc in the description of posts.

founded 2 years ago
[–] zxqwas@lemmy.world 80 points 3 weeks ago (3 children)

Either you genuinely believe you are 18 (or 24, or 36, it doesn't matter) months away from curing cancer, or you're not.

What would we as outsiders observe if they had told their investors two years ago that they were 18 months away, and now the cash runs out in 3 months?

Now I think the current iteration of AI is trying to get to the moon by building a better ladder, but what do I know.

[–] kadu@scribe.disroot.org 54 points 2 weeks ago (6 children)

There's not a single world where LLMs cure cancer, even if we decided to give the entirety of our energy output and water to a massive server using every GPU ever made to crunch away for months.

[–] IAmNorRealTakeYourMeds@lemmy.world 40 points 2 weeks ago (12 children)

Which fucking sucks, because AI was actually getting good: it could detect tumours, it could figure things out fast, it could recognise images as a tool for the visually impaired...

But LLMs are none of those things. All they can do is look like text.

LLMs are an impressive technology, but so far they're nearly useless and mostly a nuisance.

[–] BilSabab@lemmy.world 8 points 2 weeks ago (1 children)

Down in Ukraine we have a dozen or so image-analysis projects that can't catch a break, because all investors can think about is either swarm drones (quite understandably) or LLM nothingburgers that burn through money and dissipate every nine months. Meanwhile those image-analysis projects manage to make progress on what is basically scraps and leftovers.

[–] IAmNorRealTakeYourMeds@lemmy.world 7 points 2 weeks ago (1 children)

The problem is that technical people can understand the value of different AI tools. But try telling an executive with a business major how mind-blowing it is that the lab whose programs mastered Go and StarCraft went on to solve protein folding (I studied biology in 2010 and they kept repeating how impossible solving proteins in silico was).

But a chatbot that tells the executive how smart and special they are?

That's the winner.

[–] HereIAm@lemmy.world 31 points 2 weeks ago (2 children)

Not strictly LLMs, but neural nets are really good at protein folding, something that very directly helps with understanding cancer, among other things. I know an answer doesn't magically pop out, but it's important to recognise the use cases where NNs actually work well.

[–] BrioxorMorbide@lemmings.world 6 points 2 weeks ago

But giving all the resources to LLMs slows/prevents those useful applications of AI.

[–] merc@sh.itjust.works 6 points 2 weeks ago (1 children)

I'm trying to guess what industries might do well if the AI bubble does burst. I imagine there will be huge AI datacenters filled with so-called "GPUs" that can no longer even do graphics. They don't even do floating point calculations anymore, and I've heard their integer matrix calculations are lossy. So, basically useless for almost everything other than AI.

One of the few industries that I think might benefit is pharmaceuticals. I think maybe these GPUs can still do protein folding. If so, the pharma industry might suddenly have access to AI resources at pennies on the dollar.

[–] MotoAsh@piefed.social 6 points 2 weeks ago (2 children)

Integer calculations are "lossy" because they're integers; there's no extra precision there to lose. Those GPUs have plenty of uses.
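
A minimal, purely illustrative sketch (not from the thread) of what "lossy" integer math means here, assuming the usual int8 quantize-multiply-rescale scheme that AI accelerators rely on; the names, shapes and scaling choice are made up for the example:

```python
# Toy example: quantize float32 matrices to int8, multiply in integers,
# then rescale. The result only approximates the float32 product -- the
# rounding during quantization is the "loss".
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4)).astype(np.float32)
b = rng.standard_normal((4, 4)).astype(np.float32)

def quantize(x):
    """Map float32 values onto int8 with a single per-tensor scale."""
    scale = np.abs(x).max() / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

qa, sa = quantize(a)
qb, sb = quantize(b)

# Integer matmul (accumulated in int32), then rescaled back to float.
approx = (qa.astype(np.int32) @ qb.astype(np.int32)).astype(np.float32) * (sa * sb)
exact = a @ b

print("max abs error:", np.abs(exact - approx).max())  # small but nonzero
```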

[–] Quetzalcutlass@lemmy.world 8 points 2 weeks ago (2 children)

And it's clear we're nowhere near achieving true AI, because those chasing it have made no moves to define the rights of an artificial intelligence.

Which means that either they know they'll never achieve one by following the current path, or that they're evil sociopaths who are comfortable enslaving a sentient being for profit.

[–] HeyThisIsntTheYMCA@lemmy.world 8 points 2 weeks ago

> they're evil sociopaths who are comfortable enslaving a sentient being for profit.

I mean, look at what is happening in the United States. It would be completely unsurprising for that to happen here.

[–] Lushed_Lungfish@lemmy.ca 28 points 2 weeks ago (2 children)

Um, human history has repeatedly demonstrated that when a new technology emerges, the two highest priorities are:

  1. How can we kill things with this?
  2. How can we bone with this?
[–] bassad@jlai.lu 17 points 2 weeks ago

Oh, that's why they are restricting "organic" porn: to sell AI porn. Damn.

[–] NutWrench@lemmy.world 12 points 2 weeks ago (6 children)

If you've ever wondered why porn sites use pictures of cars, buses, stop signs, traffic lights, bicycles and sidewalks in their captchas, it's because they're using the data to train car-driving AIs to recognize those patterns.

This is not what an imminent breakthrough in cancer research looks like.

[–] TankovayaDiviziya@lemmy.world 11 points 2 weeks ago (4 children)

We are closer to making horny chatbots than to a superintelligence figuring out a cure for cancer.

Actually, if the latter happens, would that super AI win a Nobel prize?

[–] PrettyFlyForAFatGuy@feddit.uk 8 points 2 weeks ago

what if my kink is curing cancer?

[–] percent 7 points 2 weeks ago (2 children)

It would probably go to whoever uses it to find the cure... and to none of the authors who wrote the data it was trained on.

[–] Resonosity@lemmy.dbzer0.com 8 points 2 weeks ago

I appreciate that this post is using dark mode

[–] lightnsfw@reddthat.com 8 points 2 weeks ago (1 children)

What about an AI naughty nurse that does both?

[–] uberfreeza@lemmy.world 14 points 2 weeks ago (1 children)

You can use AI to fulfill your fantasies! For example: having healthcare (if you're not American, this joke does not apply).

[–] Pulptastic@midwest.social 8 points 2 weeks ago (2 children)

Porn can pay your way through school, so to speak

[–] frustrated@lemmy.world 8 points 2 weeks ago (4 children)

No money in curing cancer with an LLM. Heaps of money in taking advantage of increasingly alienated and repressed people.

[–] Saledovil@sh.itjust.works 12 points 2 weeks ago (1 children)

You could sell the cure for a fortune. Imagine something that can reliably cure late stage cancers. You could charge a million for the treatment, easily.

[–] frustrated@lemmy.world 6 points 2 weeks ago (2 children)

Yes, selling the actual cure would be profitable... but an LLM would only ever provide the text for synthesizing it, none of the extensive testing, licensing, manufacturing, etc. An existing pharmaceutical company would have to believe the LLM and then front the costs of development, testing, and manufacture, which constitute a large proportion of the cost of bringing a treatment to market. Burning compute time on that is a waste of resources, especially when fleecing horny losers is available right now. It's just business.

[–] echodot@feddit.uk 8 points 2 weeks ago (1 children)

There's loads of money in curing cancer. For one, you can sell the cure for cancer to people with cancer.

[–] humanspiral@lemmy.ca 7 points 2 weeks ago (2 children)

FYI, using OpenAI/ChatGPT is expensive. Programming it to program users into dependency on its "friendship" gets them to pay for more tokens, and then why not blackmail them or coerce/honeypot them into espionage for the empire? If you don't yet understand that OpenAI is an arm of the Trump/US military, among its pie-in-the-sky promises is $35B for datacenters in Argentina.

[–] Kolanaki@pawb.social 6 points 2 weeks ago

But how else would it find the hard lump on your testicles?
