[–] null@piefed.au 2 points 4 days ago

Yeah right. I guess if you process it so it's just calcium or something rather than living tissue.

[–] null@piefed.au 7 points 4 days ago (1 children)

I don't know the answer, and I don't know anything about how LLMs are tuned, but I think the answer is probably a partial yes.

My supposition is:

Instead of providing manual answers to specific questions, you modify the bot's approach to answering different types of questions.

For example, if you ask "what color are bananas" the bot answers this by looking for discussions about the color of different fruits and selects the word that seems to be provided most often.

Alternatively, if you ask "what is two plus two", when the bot parses the question it recognises that it's a math question, so instead of looking for text discussions of math, it converts it to an equation and returns the solution.
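The "convert it to an equation" supposition could be sketched as a toy router. This is purely illustrative of the idea in the comment, not how any real LLM works; the `NUMBERS` and `OPS` tables and the `answer_math` function are all made up for the example:

```python
# Toy sketch: recognise a simple spoken-word arithmetic question and
# answer it mechanically instead of by text lookup. Illustrative only.
NUMBERS = {"zero": 0, "one": 1, "two": 2, "three": 3, "four": 4}
OPS = {"plus": lambda a, b: a + b, "minus": lambda a, b: a - b}

def answer_math(question: str) -> int:
    # "what is two plus two" -> ["two", "plus", "two"]
    words = question.lower().removeprefix("what is ").split()
    a, op, b = words  # assume a simple "<number> <op> <number>" shape
    return OPS[op](NUMBERS[a], NUMBERS[b])

print(answer_math("what is two plus two"))  # 4
```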

Previously, I guess bots were answering the "how many r's" question in the text-based kind of way, and the fix made the bot interpret it in a more mechanical/mathematical kind of way.
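The contrast is easy to show: a program that sees characters counts letters trivially, while an LLM only sees token IDs. The token split shown below is hypothetical, just to illustrate the point, not the output of any real tokenizer:

```python
# Counting letters is trivial for code that sees characters:
word = "strawberry"
print(word.count("r"))  # 3

# But an LLM never sees characters -- it sees opaque token IDs.
# A hypothetical (purely illustrative) split might be:
tokens = ["str", "aw", "berry"]
# Asked "how many r's", the model gets no character-level view of
# these chunks; it has to have learned each token's spelling, which
# is why the text-based approach can fail on this question.
```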

It's a pretty salient demonstration of a bot's inability to reason. They're good at making sentences, but they can only emulate reasoning.

[–] null@piefed.au 3 points 4 days ago

I find this really hard to believe.

I'm sure that unauthorised organ harvesting has occurred in isolated circumstances.

But I'm incredulous that it could happen on an industrial scale.

[–] null@piefed.au 2 points 4 days ago (2 children)

I didn't realise that this was a thing.

I guess your body just kind of tolerates bone for some reason? Usually for transplants you need meds to suppress your immune system forever.
