admin

joined 2 years ago
[–] admin@lemmy.my-box.dev 34 points 1 year ago* (last edited 1 year ago) (2 children)

WAKE UP!

It works offline. When you use it with Ollama, you don't have to register or agree to anything.

Once you have downloaded it, it will keep on working; Meta can't shut it down.
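For the curious, here's a minimal sketch of what "no registration" looks like in practice: Ollama exposes a local HTTP API on port 11434 by default, and you just talk to it directly. The model name below is only an example of something you'd have pulled first.

```python
# Sketch: building a request for a locally running Ollama server.
# Assumes Ollama is installed and a model (e.g. via `ollama pull llama3`)
# has been downloaded already; nothing here phones home.
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but don't send) a request to Ollama's /api/generate route."""
    payload = json.dumps({
        "model": model,    # any model you've pulled locally
        "prompt": prompt,
        "stream": False,   # ask for a single JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local port
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("llama3", "Why is the sky blue?")
# urllib.request.urlopen(req) would return the completion once the server is up.
```

No API key, no account; the request never leaves your machine.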

[–] admin@lemmy.my-box.dev 97 points 1 year ago (35 children)

Technically correct (tm)

Before you get your hopes up: Anyone can download it, but very few will be able to actually run it.
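The arithmetic on why so few can run it is simple: weight memory is roughly parameter count times bytes per parameter. A quick back-of-the-envelope (weights only, ignoring KV cache and runtime overhead):

```python
# Rough footprint of just the weights for a 405B-parameter model,
# at a couple of common precisions. Pure arithmetic, no framework needed.

def weight_gib(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB (ignores KV cache and overhead)."""
    return n_params * bytes_per_param / 2**30

params_405b = 405e9
fp16_gib = weight_gib(params_405b, 2.0)  # ~754 GiB at 16-bit
q4_gib = weight_gib(params_405b, 0.5)    # ~189 GiB even at 4-bit
```

Even aggressively quantized, that's still well beyond consumer hardware.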

[–] admin@lemmy.my-box.dev 1 points 1 year ago (3 children)

Ah, that's a wonderful use case. One of my favourite models has a storytelling LoRA applied to it, maybe that would be useful to you too?

At any rate, if you end up publishing your model, I'd love to hear about it.

[–] admin@lemmy.my-box.dev 7 points 1 year ago (5 children)

Yeah, there's a massive negative circlejerk going on, but mostly with parroted arguments. Being able to locally run a model with this kind of context is huge. Can't wait for the finetunes that will result from this (*cough* NeverSleep's *-maid models come to mind).

[–] admin@lemmy.my-box.dev 1 points 1 year ago* (last edited 1 year ago)

Agreed. So in other words - everybody wins.

I'm by no means under the impression that LibreWolf will overtake Firefox in dominance anytime soon. So if Firefox does the heavy lifting and the dirty work, the community will still benefit from the improved versions downstream.

[–] admin@lemmy.my-box.dev 3 points 1 year ago (1 children)

I haven't given it a very thorough testing, and I'm by no means an expert, but from the few prompts I've run so far, I'd have to hand it to Nemo concerning quality.

Using openrouter.ai, I've also given Llama 3.1 405B a shot, and that seems to be at least on par with (if not better than) Claude 3.5 Sonnet, whilst being a bit cheaper as well.
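If anyone wants to try the same comparison: OpenRouter speaks an OpenAI-compatible chat-completions API. A hedged sketch of the request body (the model slug below is my assumption of the ID format, so double-check it on openrouter.ai before relying on it):

```python
# Sketch of an OpenRouter chat-completions payload (OpenAI-compatible API).
# The model slug is an assumed example -- verify exact IDs on openrouter.ai.
import json

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_payload(model: str, user_message: str) -> str:
    """Serialise a single-turn chat request body."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    })

body = build_chat_payload(
    "meta-llama/llama-3.1-405b-instruct",  # assumed slug, check before use
    "Summarise the CAP theorem in one sentence.",
)
# POST `body` to OPENROUTER_URL with an `Authorization: Bearer <key>` header.
```

Swapping the model string between providers is all it takes to A/B them.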

[–] admin@lemmy.my-box.dev 5 points 1 year ago (2 children)

Orrr... It's like saying Firefox should keep on doing whatever it's doing, and people who care will get its benefits without having to suffer its drawbacks.

[–] admin@lemmy.my-box.dev 3 points 1 year ago (4 children)

Get downstreamed into LibreWolf.

[–] admin@lemmy.my-box.dev 28 points 1 year ago (15 children)

128k token context is pretty sweet. Mistral Nemo also just launched with a similar context. Good times.

[–] admin@lemmy.my-box.dev 15 points 1 year ago

> The problem with LLMs is that they require immense compute power.

To train. But you can run a relatively simple one, like Phi-3, on quite modest hardware.
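The same weight-size arithmetic as for the big models makes the point: Phi-3-mini is roughly 3.8B parameters, so a 4-bit quant is laptop territory (weights only; actual usage adds some overhead):

```python
# Weight footprint of a small model: Phi-3-mini is ~3.8B parameters,
# so a 4-bit quantisation fits in under 2 GiB of RAM (weights only).

phi3_params = 3.8e9
phi3_q4_gib = phi3_params * 0.5 / 2**30  # 0.5 bytes per param at 4-bit
```

Running inference is cheap; it's the training run that needed the data centre.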

[–] admin@lemmy.my-box.dev 0 points 1 year ago

It is my understanding that this driver had not been (re)certified by Microsoft, though. So in that case, I stand by my statement.

If it had been, I'd agree with that blame.

[–] admin@lemmy.my-box.dev 24 points 1 year ago (4 children)

^ Mocks people for using the default, then proceeds to not give alternatives they deem better.
