ech

joined 1 month ago
[–] ech@lemmy.ca 7 points 3 weeks ago

Their "definition" is wrong. They don't get to redefine words to support their vague (and also wrong) suggestion that llms "might" have consciousness. It's not "difficult to say" - they don't, plain and simple.

[–] ech@lemmy.ca 8 points 3 weeks ago

Except these algorithms don't "know" anything. They convert their input data into a statistical model used to generate (hopefully) sensible text from literal random noise. At no point in that process is knowledge used.
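To make that concrete, here's a toy sketch (made-up probabilities, nothing like any real model's internals): generation is a weighted random draw over next tokens, and there is no fact lookup anywhere in the loop.

```python
import random

# Toy next-token table with invented probabilities. A real LLM has
# billions of learned parameters, but generation is still this shape:
# a weighted random draw conditioned on the previous tokens.
MODEL = {
    "the": {"cat": 0.5, "dog": 0.3, "sky": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "dog": {"barked": 0.7, "sat": 0.3},
    "sky": {"fell": 1.0},
}

def generate(token: str, max_len: int = 5) -> str:
    out = [token]
    for _ in range(max_len):
        choices = MODEL.get(out[-1])
        if not choices:
            break
        # No facts consulted, no intent formed - just a dice roll.
        out.append(random.choices(list(choices), weights=list(choices.values()))[0])
    return " ".join(out)

print(generate("the"))  # e.g. "the dog sat" - plausible text, zero knowledge
```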

[–] ech@lemmy.ca 12 points 3 weeks ago (9 children)

> but like calling it a lie is the most efficient means to get the point across.

It very much doesn't, because it reinforces the idea that these algorithms know or plan anything. It is entirely inefficient to treat an LLM like a person, as the clown in the screenshots demonstrated.

[–] ech@lemmy.ca 10 points 3 weeks ago

Correct. Because there is no "pursuit of untruth". There is no pursuit, period. It's putting together words that statistically match up based on the input it receives. The output can be wrong, but it's never "lying", even if the words it strings together resemble a lie.

[–] ech@lemmy.ca 13 points 3 weeks ago

Demanding the algorithm apologize is off the charts unhinged. It's amazing that people this stupid have achieved enough to fail this badly.

[–] ech@lemmy.ca 69 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

They're trying to insinuate that the accusations against them have the same legitimacy as the ones they've been throwing at everyone for decades.

If their claims are rebuked as baseless, they'll call the claims against them baseless as well. If the claims against them are said to be substantiated, they'll point to their own claims and suggest the same. It's the "heads I win, tails you lose" of political schemes.

[–] ech@lemmy.ca 22 points 3 weeks ago* (last edited 3 weeks ago)

Step 1. Feed your code/data into the context/prompt

Step 2. Automatically execute the machine's response as commands

Step 3. Lose your entire database
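A minimal sketch of that failure mode (the `query_llm` helper is hypothetical, standing in for any LLM API; the strings are made up):

```python
import subprocess

def query_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real LLM API call. It returns whatever
    # text the model generated, plausible-looking or not.
    return "echo 'DROP TABLE users;' | psql production"  # illustrative output

# Step 1: feed the code/data into the context/prompt.
prompt = "Tidy up my database. Schema follows:\nCREATE TABLE users (...);"

# Step 2: execute the response as shell commands, sight unseen.
command = query_llm(prompt)
subprocess.run(command, shell=True, check=False)

# Step 3: if the generated text happened to be destructive, the data is gone.
```

Nothing between steps 2 and 3 checks the generated text against reality; that gap is the entire problem.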

[–] ech@lemmy.ca 80 points 3 weeks ago (29 children)

Both require intent, which these do not have.

[–] ech@lemmy.ca 206 points 3 weeks ago (35 children)

Hey dumbass (not OP), it didn't "lie" or "hide it". It doesn't have a mind, let alone the capability to choose to mislead someone. Stop personifying this shit, and maybe you won't trust it to manage crucial infrastructure like that and then suffer the entirely predictable consequences.

[–] ech@lemmy.ca 3 points 3 weeks ago

Because the greater populace has fallen for their effort to redefine the word. It used to declare awareness of oppression and prejudice. Now it's a word to avoid or denounce, even by those who should be embracing it.
