Technically correct (tm)
Before you get your hopes up: Anyone can download it, but very few will be able to actually run it.
Ah, that's a wonderful use case. One of my favourite models has a storytelling lora applied to it, maybe that would be useful to you too?
At any rate, if you'd end up publishing your model, I'd love to hear about it.
Yeah, there's a massive negative circlejerk going on, but mostly with parroted arguments. Being able to locally run a model with this kind of context is huge. Can't wait for the finetunes that will result from this (*cough* NeverSleep's *-maid models come to mind).
Agreed. So in other words - everybody wins.
I'm by no means under the impression that librewolf will overtake Firefox in dominance anytime soon. So if Firefox does the heavy lifting and the dirty work, the community will still benefit from these better versions downstream.
I haven't given it very thorough testing, and I'm by no means an expert, but from the few prompts I've run so far, I'd have to hand it to Nemo concerning quality.
Using openrouter.ai, I've also given llama3.1 405B a shot, and that seems to be at least on par with (if not better than) Claude 3.5 Sonnet, whilst being a bit cheaper as well.
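For anyone curious, here's a minimal sketch of what a one-off prompt through openrouter.ai can look like. It assumes OpenRouter's OpenAI-compatible `/chat/completions` endpoint and the `meta-llama/llama-3.1-405b-instruct` model id; the API key is a placeholder you'd replace with your own.

```python
# Sketch: single-turn chat completion against openrouter.ai's
# OpenAI-compatible endpoint. Model id and endpoint are assumptions
# based on OpenRouter's public API; the key is a placeholder.
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(prompt, model="meta-llama/llama-3.1-405b-instruct",
                  api_key="YOUR_API_KEY"):
    """Build the HTTP request for a single user prompt."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Usage (needs a real key and network access):
# with urllib.request.urlopen(build_request("Hello!")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```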
Orrr... It's like saying Firefox should keep on doing whatever it's doing, and people who care will get its benefits without having to suffer its drawbacks.
Get downstreamed into librewolf.
128k token context is pretty sweet. Mistral Nemo also just launched with a similar context. Good times.
The problem with LLMs is that they require immense compute power.
To train. But you can run a relatively simple one like phi-3 on quite modest hardware.
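To give an idea of how little is involved once the model is downloaded, here's a rough sketch of prompting a local phi-3 through ollama's HTTP API. It assumes `ollama serve` is running on the default port 11434 and that the model was pulled once as `phi3`; those names are my assumptions, not gospel.

```python
# Sketch: one-shot, non-streaming prompt to a locally running phi-3
# via ollama's HTTP API (default port 11434; model name "phi3" assumed).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_prompt_request(prompt, model="phi3"):
    """Build a non-streaming /api/generate request for the local ollama server."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Usage (needs `ollama serve` running and `ollama pull phi3` done once):
# with urllib.request.urlopen(build_prompt_request("Why is the sky blue?")) as resp:
#     print(json.load(resp)["response"])
```

Everything here talks to localhost only, which is the whole point: no account, no cloud.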
It is my understanding that this driver had not been (re) certified by Microsoft, though. So in that case, I stand by my statement.
If it had been, I'd agree with that blame.
^ Mocks people for using the default, then proceeds to not give alternatives they deem better.
WAKE UP!
It works offline. When you use it with ollama, you don't have to register or agree to anything.
Once you have downloaded it, it will keep on working; Meta can't shut it down.