this post was submitted on 07 Mar 2026
937 points (99.1% liked)

Technology

83295 readers
5070 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes using AI responses and summaries. To ask if your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
[–] ArbitraryValue@sh.itjust.works 4 points 3 weeks ago (1 children)

If you don't want legal or medical advice from an AI, you can already simply not ask the AI for legal or medical advice. But I don't want your paternalistic restrictions on what I may ask.

[–] moroninahurry@piefed.social 2 points 3 weeks ago

Sir, did you pay for that medical advice, though? That's what these laws will eventually enforce: prescription advice.

[–] henfredemars 4 points 3 weeks ago (11 children)

Mixed feelings about this. Let me play devil's advocate and say that many Americans don't have access to these resources at all. Is having potentially inaccurate resources better than nothing, or is it worse?

[–] smh@slrpnk.net 2 points 3 weeks ago

We had a medical scare just yesterday. I was in the ER for 8 hours with my partner over a non-life-threatening but still emergency problem.

An ultrasound, CT scan, and much poking and prodding later, we still don't know what's up. The AI was at least able to predict next steps (if A, then discharge and follow up with PCP; if B, then surgery this week; if C, then emergency surgery), something the ER was too busy to do for several hours. It was reassuring. The AI also gave me (working) links to more thorough resources on the topic.

[–] TropicalDingdong@lemmy.world 4 points 3 weeks ago* (last edited 3 weeks ago) (23 children)

I mean.

Is Wikipedia responsible for you reading an article about a law and then taking that as legal advice?

[Edit: if you are downvoting this, downvote away, but you owe an argument below as to why. I promise this exact argument will come up in the courts over this issue]

[–] webkitten@piefed.social 3 points 3 weeks ago (2 children)

This bill gave us the "best" interaction:

https://bsky.app/profile/badmedicaltakes.bsky.social/post/3mghyg5eufk2m

A Bluesky skeet from @badmedicaltakes.bsky.social:

"Twitter user eoghan:

How dare poor people get free medical advice

<quote tweet from Twitter user Polymarket: BREAKING: New York bill would ban AI from answering questions related to medicine, law, dentistry, nursing, psychology, social work, engineering, & more.>

Twitter user YBrogard79094:
JUST MAKE HEALTHCARE ACCESSIBLE

Twitter user eoghan:

AI is literally free healthcare. Being a communist must be exhausting"

[–] Hiro8811@lemmy.world 4 points 3 weeks ago

You can google your symptoms, and there probably are some reliable sites, but a hallucinating chatbot is a bad idea. Not to mention, some people suggested treating COVID with chlorine, vinegar, etc.

[–] deliriousdreams@fedia.io 3 points 3 weeks ago

Some horses you can't even lead to water. Let alone make them drink.

[–] NutWrench@lemmy.world 3 points 3 weeks ago

Chatbots should never give medical advice. Chatbots dispense basic, standalone factoids, like "aspirin is a pain reliever." But they don't know or care about dosages, comorbid conditions, or whether you live or die, so they won't ask follow-up questions.
