jsomae

joined 1 year ago
[–] jsomae@lemmy.ml 1 points 1 year ago (8 children)

I have to hard disagree here. Laplace's law of succession does not require that assumption. It's easy to see why intuitively: if it turns out the probability is 0 (or 1), then the predicted probability from Laplace's law of succession converges to 0 (or 1) as more results come in.
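
For reference, here's the rule in question and the limit behavior I mean (a standard statement, spelled out):

```latex
% Laplace's rule of succession: after s successes in n independent
% trials, the estimated probability of success on the next trial is
P(\text{success on trial } n+1) = \frac{s+1}{n+2}
% Limit behavior: if the true probability is 0, then s = 0 for all n,
% and the estimate 1/(n+2) -> 0 as n grows; symmetrically, if it is 1,
% then s = n and (n+1)/(n+2) -> 1.
```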

[–] jsomae@lemmy.ml 1 points 1 year ago* (last edited 1 year ago) (10 children)

Ok, but from that same perspective, you could say convolutional neural networks have been around since the 80s. It wasn't until Geoffrey Hinton's group put them back on the map in 2012ish that anyone cared. GPT2 is when I started paying attention to LLMs, and that's 5 years old or so.

Even a decade is new in the sense that Laplace's law of succession alone indicates there's still roughly an 8% (1/12) chance we'll solve the problem in the next year.
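
A quick sanity check of that figure (a minimal sketch in Python; the function name and numbers are just illustrative):

```python
def rule_of_succession(successes: int, trials: int) -> float:
    """Laplace's rule of succession: estimated probability that the
    next trial succeeds, given `successes` out of `trials` so far."""
    return (successes + 1) / (trials + 2)

# A decade of working on the problem with no success yet:
print(rule_of_succession(successes=0, trials=10))  # 0.0833... = 1/12
```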

[–] jsomae@lemmy.ml 1 points 1 year ago* (last edited 1 year ago) (12 children)

Absolutely not true. The probabilities of stupid things are very low; that's because they are stupid. If we expected such things to be probable, we probably wouldn't call them stupid.

I have plenty of evidence to believe magic isn't real. Don't mistake "no evidence (and we haven't checked)" for "no evidence (but we've checked)". I've lived my whole life and haven't seen magic, and I have a very predictive model for the universe which has no term for 'magic'.

LLMs are new, and the field has made sweeping, landmark improvements every year since GPT2. Therefore I have reason to believe (not 100%!) that we are still in the gold-rush phase and new landmark improvements will continue to be made in the field for some time. I haven't really seen an argument that hallucination is an intractable problem, and while it's true that all LLMs have hallucinated so far, GPT4 hallucinates much less than GPT3, and GPT3 hallucinates a lot less than GPT2.

But realistically speaking, even if I were unknowledgeable and unqualified to say anything with confidence about LLMs, I could still say this: for any statement X about LLMs that doesn't look obviously stupid to an unknowledgeable person, that person should assign X a probability of 50%. We know this because the negation of that statement, call it ¬X, would be equally opaque to them. Since X and ¬X are mutually exclusive and exhaustive, and we have no reason to favor one over the other, both get probability 50%.
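
Spelled out as a worked step:

```latex
% X and ¬X are mutually exclusive and exhaustive, so
P(X) + P(\neg X) = 1
% Indifference: with no reason to favor either side,
P(X) = P(\neg X) \implies P(X) = \tfrac{1}{2}
```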

[–] jsomae@lemmy.ml 3 points 1 year ago

There is probably a simple explanation for this. The previous keyboard I was using was FlorisBoard, which has very poor gesture typing in my opinion. This gesture typing is leagues better than FlorisBoard's, so to me it feels like a sudden breath of fresh air.

[–] jsomae@lemmy.ml 1 points 1 year ago (14 children)

If I have no reason to believe X and no reason not to believe X, then the probability of X would be 50%, no?

[–] jsomae@lemmy.ml 6 points 1 year ago (3 children)

More discussion on lemmy here, including how to enable gesture typing (I'm very impressed with it)

[–] jsomae@lemmy.ml 3 points 1 year ago

I am happy to hear you are enjoying lemmy.

But I think we might be talking past each other about what "problem" means here:

problem: there are too few AFAB people in the space
problem: the space is unwelcoming to AFAB people

Reddit may not be (excessively) misogynistic (or transmisandric), but fewer AFAB people are on reddit than we would have liked. Lemmy, I think, is the same but even more extreme.

[–] jsomae@lemmy.ml 1 points 1 year ago (16 children)

I have no reason to believe the problem can't be solved, except insofar as it hasn't been solved yet (but LLMs only recently took off). So without a good reason to believe it's intractable, I'm at worst 50/50 on whether it can be solved. Faith in the machine spirit would be if I had an unreasonably high expectation that LLMs can be made not to hallucinate, like 100%.

My expectation is around 70% that it's solvable.

[–] jsomae@lemmy.ml 3 points 1 year ago (2 children)

That's my theory too. I am AFAB but I have a brother who got me interested in CS at a young age.

Perhaps reddit also has an AFAB problem. Or rather, the effect you're thinking of that makes cis women rare here applies equally to other AFAB people.

[–] jsomae@lemmy.ml 15 points 1 year ago (4 children)

I wouldn't be surprised if police are jumpier/more trigger-happy in the U.S. due to the increased prevalence of guns there. But I also wouldn't be surprised if it had no effect (anyone could have a gun, after all).

[–] jsomae@lemmy.ml 2 points 1 year ago (18 children)

LLMs may fabricate things now and then, but so do humans. I am not convinced the problem is intractable.
