
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/homeassistant by /u/alin_im on 2025-04-16 14:43:16+00:00.


Which Local LLM do you use? How many GB of VRAM do you have? Which GPU do you use?

EDIT: I know that local LLMs and voice are in their infancy, but it is encouraging to see that you guys use models that can fit within 8GB. I have a 2060 Super that I need to upgrade, and I was considering keeping it as a dedicated AI card, but I thought it might not be enough for a local assistant.

EDIT2: Any tips on optimizing entity names?
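
For context on entity-name tuning: friendly names can be overridden in `configuration.yaml` so the assistant sees short, descriptive names instead of cryptic device IDs. A minimal sketch, with hypothetical entity IDs:

```yaml
# configuration.yaml -- override auto-generated names with short,
# descriptive ones so a local LLM can match spoken requests more easily.
# The entity IDs below are hypothetical examples.
homeassistant:
  customize:
    light.tplink_smart_bulb_a19_3:
      friendly_name: "Living Room Lamp"
    climate.generic_thermostat_1:
      friendly_name: "Hallway Thermostat"
```

Assist also supports per-entity aliases (Settings → Voice assistants → exposed entities), which serve a similar purpose without renaming the entity itself.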

no comments (yet)