FaceDeer

joined 2 years ago
[–] FaceDeer@fedia.io 59 points 1 year ago

Ukraine has invaded Russia and seized more Russian territory in the past five days than the Russians have managed to take from Ukraine in months. Seemingly with very little effort or casualties.

[–] FaceDeer@fedia.io 1 points 1 year ago (7 children)

I have genuinely found LLMs to be useful in many contexts. I use them to brainstorm and flesh out ideas for tabletop roleplaying adventures, to write song lyrics, to write Python scripts to do various random tasks. I've talked with them to learn about stuff, and verified that they were correct by checking their references. LLMs are demonstrably capable of these things. I demonstrated it.

Go ahead and refrain from using them yourself if you really don't want to, for whatever reason. But exclaiming "no it doesn't!" in the face of them actually doing the things you say they don't is just silly.

[–] FaceDeer@fedia.io 1 points 1 year ago (1 children)

Then go ahead and put "science questions" into one of the areas that you don't use LLMs for. That doesn't make them useless in general.

I would say that a more precise and specific restriction would be "they're not good at questions involving numbers." That's narrower than "science questions" in general; they're still pretty good at dealing with the concepts involved. LLMs aren't good at math, so don't use them for math.

[–] FaceDeer@fedia.io 0 points 1 year ago (9 children)

Your comment is simply counterfactual. I do indeed find LLMs to be useful. Saying "no you don't!" is frankly ridiculous.

I'm a computer programmer. I'm not directly experienced with LLMs' internals, but I understand the technology around them and have written programs that make use of them. I know what their capabilities and limitations are.

[–] FaceDeer@fedia.io 27 points 1 year ago (2 children)

I'm sure the Ukrainian soldiers are rather busy with important things of their own, but if they've got any spare bandwidth it'd be neat if they were able to help organize the Russian civilians a bit and keep this kind of lawlessness suppressed. Heck, if they're digging in for the long term they may end up needing to provide humanitarian aid for the people who chose to stay behind. That'll be quite the look.

[–] FaceDeer@fedia.io 3 points 1 year ago

Our "intelligence" agencies already kill innocent people based entirely on metadata — because they simply live or work around areas that known terrorists occupy — now imagine if an AI was calling the shots.

So by your own scenario, intelligence agencies are already getting stuff wrong and making bad decisions using existing methodologies.

Why do you assume that new methodologies that involve LLMs will be worse at that? Why could they not be better? Presumably they're going to be evaluating their results when deciding whether to make extensive use of them.

"Mathematical magic tricks" can turn out to be extremely useful. That phrase can be used to describe all manner of existing techniques that are undeniably foundational to civilization.

[–] FaceDeer@fedia.io -1 points 1 year ago (14 children)

Except it is capable of meaningfully doing so, just not in every conceivable situation. And those rare flubs are the ones that get spread around and laughed at, such as this example.

There's a nice phrase I commonly use, "don't let the perfect be the enemy of the good." These AIs are good enough at this point that I find them to be very useful. Not perfect, of course, but they don't have to be as long as you're prepared for those occasions, like this one, where they give a wrong result. Like any tool you have some responsibility to know how to use it and what its capabilities are.

[–] FaceDeer@fedia.io 68 points 1 year ago (22 children)

I expect if you follow the references you'd find one of them to be one of those "if Earth was a grain of sand" analogies.

People like laughing at AI but usually these silly-sounding answers accurately reflect the information the search returned.

[–] FaceDeer@fedia.io 51 points 1 year ago (2 children)

Whatever it was, it wasn't pure mercury. The victim suffered a severe reaction. Some organic mercury compounds are incredibly dangerous and can soak in easily through the skin. I'm surprised she's only facing three years, this sounds like attempted murder.

[–] FaceDeer@fedia.io 74 points 1 year ago

A surprise, to be sure. But a welcome one.

[–] FaceDeer@fedia.io 7 points 1 year ago (2 children)

Some people are into humiliation, don't kinkshame.

[–] FaceDeer@fedia.io 4 points 1 year ago

I run tabletop roleplaying adventures and LLMs have proven to be great "brainstorming buddies" when planning them out. I bounce ideas back and forth, flesh them out collaboratively, and have the LLM speak "in character" to give me ideas for what the NPCs would do.

They're not quite up to running the adventure themselves yet, but it's an awesome support tool.
