ech

joined 1 month ago
[–] ech@lemmy.ca 20 points 2 weeks ago

Ozzy, played by Weird Al, played by Daniel Radcliffe.

[–] ech@lemmy.ca 3 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I explained why the word matters in my very first comment, and several since. You're the one that started the argument on semantics, so you tell me.

[–] ech@lemmy.ca 1 points 2 weeks ago (3 children)

What is casual about the situation in the screenshots? You keep bringing that up as if it changes anything.

[–] ech@lemmy.ca 11 points 2 weeks ago

> Brett Hankison, who fired 10 shots during the raid but didn’t hit anyone, was the only officer on the scene charged in the Black woman’s death.

WTF is this? A token sacrifice while the killer walks free? Fuck that.

[–] ech@lemmy.ca 5 points 2 weeks ago

You're not the boss of me, now!

[–] ech@lemmy.ca 5 points 2 weeks ago (1 children)

The 40k fascism cosplayers already made "god emperor" a thing. Might as well go the whole mile.

[–] ech@lemmy.ca 3 points 2 weeks ago* (last edited 2 weeks ago) (6 children)

No, it doesn't. Would you say a calculator "lied" to you if it output an incorrect answer? Is your watch "lying" to you when it's out of sync? No, obviously not. They're just wrong, not "telling falsehoods".

[–] ech@lemmy.ca 7 points 2 weeks ago

Their "definition" is wrong. They don't get to redefine words to support their vague (and also wrong) suggestion that llms "might" have consciousness. It's not "difficult to say" - they don't, plain and simple.

[–] ech@lemmy.ca 8 points 2 weeks ago

Except these algorithms don't "know" anything. They convert the input data into a statistical framework for generating (hopefully) sensible text from what is literally random noise. At no point in that process is knowledge used.

[–] ech@lemmy.ca 12 points 2 weeks ago (9 children)

> but like calling it a lie is the most efficient means to get the point across.

It very much doesn't, because it reinforces the idea that these algorithms know or plan anything. It is entirely inefficient to treat an LLM like a person, as the clown in the screenshots demonstrated.

[–] ech@lemmy.ca 10 points 2 weeks ago

Correct. Because there is no "pursuit of untruth". There is no pursuit, period. It's putting words together that statistically match up based on the input it receives. The output can be wrong, but it's never "lying", even if the words it puts together resemble a lie.
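
To make that concrete, here's a toy sketch in Python (my own illustration, not how any particular model is actually implemented): a tiny bigram "model" that just does weighted sampling over next words. Nothing in it looks up facts, checks truth, or intends anything, so when it outputs something wrong, it's wrong, not lying. Real LLMs use billions of learned weights instead of a hard-coded table, but the generation loop is the same kind of statistical sampling.

```python
import random

# Toy "language model": for each previous word, a table of possible next
# words and their probabilities. This table is made up for illustration;
# a real LLM learns its probabilities from training data, but generation
# is still weighted sampling with no notion of true vs. false.
BIGRAMS = {
    "<start>": (["the", "a"], [0.6, 0.4]),
    "the":     (["moon", "sun"], [0.5, 0.5]),
    "a":       (["moon", "sun"], [0.5, 0.5]),
    "moon":    (["is"], [1.0]),
    "sun":     (["is"], [1.0]),
    "is":      (["made", "bright"], [0.5, 0.5]),
    "made":    (["of"], [1.0]),
    "of":      (["cheese", "rock"], [0.5, 0.5]),
}

def generate(max_words: int = 8) -> str:
    word, out = "<start>", []
    for _ in range(max_words):
        if word not in BIGRAMS:
            break
        choices, weights = BIGRAMS[word]
        # The only thing happening here is a weighted coin flip.
        word = random.choices(choices, weights=weights)[0]
        out.append(word)
    return " ".join(out)

print(generate())  # might print "the moon is made of cheese" -- wrong, but not a "lie"
```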
