theterrasque

joined 2 years ago
[–] theterrasque 2 points 10 months ago

Yes, and it has measurably improved some tasks. Roughly a 20% improvement on programming tasks, as a practical example. It has also improved tool use and agentic tasks, allowing the LLM to plan ahead and adjust its initial approach based on later parts.

Having the LLM talk through the task lets it improve or fix bad decisions made early on, based on new realizations at later stages. Sort of like when a human thinks through how to do something.
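To illustrate the idea, here's a minimal sketch of a "plan, then self-correct" loop. The `call_llm` function is a hypothetical stub standing in for a real model call; a real setup would query an actual LLM both times. The point is the shape: a first pass drafts a plan, a second pass re-reads that draft and fixes early decisions in light of later realizations.

```python
# Sketch of a "plan, then self-correct" loop. `call_llm` is a
# hypothetical stand-in for a real model call; here it just
# simulates the behavior for illustration.

def call_llm(prompt: str) -> str:
    # Stub: a real implementation would query an LLM here.
    if "Draft a plan" in prompt:
        # First pass: an early (wrong) assumption that the input is CSV.
        return "1. Parse input as CSV\n2. Sum column B"
    # Review pass: the model "realizes" the input is JSON, not CSV.
    return "1. Parse input as JSON\n2. Sum field 'b'"

def plan_with_review(task: str) -> str:
    draft = call_llm(f"Draft a plan for: {task}")
    # Second pass: the model re-reads its own plan and revises
    # early decisions based on what it worked out later.
    revised = call_llm(f"Review and fix this plan for '{task}':\n{draft}")
    return revised

print(plan_with_review("sum the values in data.json"))
```

Same task, two passes: the review step is where the early mistake gets caught.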

[–] theterrasque 3 points 10 months ago

We'll still have models like deepseek, and (hopefully) discount used server hardware

[–] theterrasque 12 points 10 months ago

I've seen some saying that "lifetime" refers to product lifetime, which is not expected to be more than X years. So yeah, slimes gonna slime

[–] theterrasque 3 points 10 months ago

Whoa, this isn't wood shop class?

[–] theterrasque 6 points 11 months ago

"South American shot after opening fire on police searching for illegal immigrant gang members"

FTFY

[–] theterrasque 1 points 11 months ago

Well, it wasn't a comment on the quality of the model, just that the context limitation has already been largely overcome by one company, and others will probably follow (and improve on it further) over time. Especially as "AI Coding" gets more marketable.

That said, was this the new gemini 2.5 pro you tried, or the old one? I haven't tried the new model myself, but I've heard good things about it.

[–] theterrasque 2 points 11 months ago

Yeah, I've been seeing the same. Purely economically, it no longer makes sense to hire junior developers. AI is faster, cheaper, and usually writes better code too.

The problem is that you need junior developers working and gaining experience, otherwise you won't get senior developers. I really wonder what development as a profession will look like in 10 years.

[–] theterrasque 1 points 11 months ago (2 children)

Working on a big codebase, I don't even get the idea to ask an AI, you just can't feed enough context to the AI that it's really able to generate meaningful code...

That's not a hard limit; for example, Google's models can handle a 2-million-token context window.

https://ai.google.dev/gemini-api/docs/long-context
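To put that number in perspective, here's a rough back-of-envelope sketch of whether a codebase would fit in a 2-million-token window. It uses the common ~4 characters per token heuristic, which is only an approximation; the real count depends on the model's tokenizer.

```python
import os

# Rough check: would a codebase fit in a 2M-token context window?
# Uses the ~4 chars/token heuristic, which is an approximation only.

CONTEXT_WINDOW = 2_000_000  # tokens
CHARS_PER_TOKEN = 4         # heuristic, not the model's real tokenizer

def estimate_tokens(total_chars: int) -> int:
    return total_chars // CHARS_PER_TOKEN

def codebase_chars(root: str, exts=(".py", ".js", ".ts")) -> int:
    # Total bytes of source files under `root` (bytes ~ chars for ASCII code).
    total = 0
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name.endswith(exts):
                total += os.path.getsize(os.path.join(dirpath, name))
    return total

# Example: ~5 MB of source text is roughly 1.25M tokens, under the limit.
tokens = estimate_tokens(5_000_000)
print(tokens, tokens <= CONTEXT_WINDOW)
```

So a multi-megabyte codebase can plausibly fit whole, which is why the "can't feed enough context" objection is softening.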

[–] theterrasque 3 points 11 months ago

AI isn't ready to replace programmers, engineers or IT admins yet.

On the other hand.. it's been about 2.5 years since ChatGPT came out, and it's gone from you being lucky if it could write a few lines of Python without errors to being able to one-shot a game of mobile-app-level complexity, even with self-hosted models.

Who knows where it'll be in a few years

[–] theterrasque 3 points 11 months ago

Well, anything else just wouldn't be Christian, you know. I'd hate to have to report you..

[–] theterrasque 5 points 11 months ago

In the wise words of Londo Mollari

Only an idiot fights a war on two fronts. Only the heir to the throne of the kingdom of idiots would fight a war on twelve fronts.

[–] theterrasque 4 points 1 year ago (1 children)

Since I already use ZFS for my data storage, I just created a private dataset for sensitive data. I also have my services split by whether they're sensitive or not, so the non-sensitive stuff comes up automatically and the sensitive stuff waits for me to log in and unlock the dataset.
