Llama3 8B can be run in 6 GB of VRAM, and it's fairly competent. Gemma has a 9B I think, which would also be worth looking into.
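A back-of-envelope sketch of why an 8B model fits in 6 GB, assuming 4-bit quantized weights and roughly 1 GB of runtime overhead for KV cache and activations (both figures are rough assumptions, not measurements):

```python
def vram_gb(params_billions: float, bits_per_weight: int, overhead_gb: float = 1.0) -> float:
    """Rough VRAM estimate: quantized weight size plus a flat overhead."""
    weight_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb + overhead_gb

# 8B params at 4 bits per weight: 4 GB of weights + ~1 GB overhead ≈ 5 GB
print(round(vram_gb(8, 4), 1))  # → 5.0
```

At full fp16 the same model would need roughly 17 GB, which is why quantization is what makes the 6 GB figure plausible.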
theterrasque
Yep. These days the alternatives are "yes" and "ask again later", with yes being the default. "No" is not an option any more.
That's super green!
"braid made us money. We like money. Braid stopped giving us money. We want more money"
That's like saying a car crash is just a fancy word for an accident, or a cat is just a fancy term for an animal.
Hallucination is a technical term for this type of AI, and it's inherent to how it works at its core.
And now I'll let you get back to your hating.
If they only had a teacher there with a gun, this wouldn't have been a problem at all
Isn't there a Geneva convention against inflicting such horror on an enemy?
And just to top it off, make this Python script a dialect of Rust
Better background backups
Rework background backups to be more reliable
Hilarious for a system whose main point / feature is photo backup
I worked on one where the columns were databasename_tablename_column
They said it makes things "less confusing"
https://www.reddit.com/r/LocalLLaMA/comments/173jqpe/realtime_fallacy_detection_in_political_debates/ has something a bit similar