this post was submitted on 28 Jul 2025
homeassistant

I recently integrated self-hosted Ollama with Home Assistant and it's pretty cool.

I'd also like to be able to ask the Home Assistant Assist things like "What movies are playing in MyCity?" and have the model in Ollama search the web and summarize the results. I use Open-WebUI as a frontend for Ollama and it can do that, but I can't see how to configure HA's Assist to do it.

Has anyone done this?

Another example would be to ask "what's the weather today?", and I realize that HA has a weather integration built in and thus I wouldn't need Ollama to search the web for results. Is there a web search integration for HA, so that any question (like "what movies are playing") could be passed to it?

My ultimate goal is to be able to ask questions about the world (i.e., search the web and summarize the results) and control Home Assistant devices. It doesn't have to be HA's Assist, but if I were to use Open-WebUI* for example, I'd have to come up with a way to link my audio-listening devices to it. It also doesn't have to be one "entity", but it should "feel like" one entity to the user (i.e., I don't want to have to speak to a different device or have a different wakeword to ask about the world than I do to control Home Assistant).
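To make the "feels like one entity" goal concrete, a single entry point could route each utterance either to Home Assistant's built-in conversation handling (for device control) or to a web-search-plus-LLM pipeline (for questions about the world). A minimal sketch of that routing decision, where the keyword list is a hypothetical heuristic and not part of any HA API:

```python
"""Sketch of a single-entry-point router: device commands go to the
local Home Assistant conversation agent, everything else falls through
to a web-search + Ollama pipeline. The keyword set below is a made-up
placeholder heuristic."""

# Hypothetical phrases suggesting a smart-home command rather than a
# general question about the world.
DEVICE_KEYWORDS = {
    "turn on", "turn off", "dim", "lock", "unlock",
    "set temperature", "open the", "close the",
}


def route_query(text: str) -> str:
    """Return 'home_assistant' for device control, 'web_search' otherwise."""
    lowered = text.lower()
    if any(kw in lowered for kw in DEVICE_KEYWORDS):
        return "home_assistant"
    return "web_search"
```

With a router like this in front, the user keeps one wake word and one device, and only the backend that answers changes.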

*Open-WebUI has integrations to control Home Assistant.

[–] just_another_person@lemmy.world 1 points 5 days ago (1 children)

Would need more information on what you've done here, but I assume you just used the default add-on, which does not handle asynchronous requests like that out of the box. The very limited docs do mention building 'Sentence Triggers'. You'd probably need to do a few things:

  • Tweak the timeout for responses to allow time for it to get a formed payload and send it back
  • Look at other plugins as examples to form custom sentence triggers
  • Set some sort of boundaries to decide when it expects a response from the remote Ollama instance versus using internal data as a response
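The custom sentence trigger idea above can be sketched as a Home Assistant automation that forwards the spoken question to a `rest_command` pointing at an Ollama endpoint. Everything here is an assumption for illustration: the URL, the model name, and the payload shape would depend on your setup, and plain Ollama would still need something in front of it (like Open-WebUI) to actually perform the web search.

```yaml
# configuration.yaml -- hypothetical rest_command toward a service that
# does the search + summarization (URL, model, and payload are assumed)
rest_command:
  ask_ollama:
    url: "http://ollama.local:11434/api/generate"
    method: post
    timeout: 60  # allow time for search + inference, per the first bullet
    content_type: "application/json"
    payload: '{"model": "llama3", "prompt": "{{ question }}", "stream": false}'

# automations.yaml -- sentence trigger that forwards the question and
# speaks the result back through Assist
automation:
  - alias: "Ask the web via Ollama"
    trigger:
      - platform: conversation
        command: "what movies are playing in {city}"
    action:
      - service: rest_command.ask_ollama
        data:
          question: "What movies are playing in {{ trigger.slots.city }}?"
        response_variable: ollama_response
      - set_conversation_response: >-
          {{ ollama_response.content.response }}
```

The sentence trigger itself also acts as the boundary from the third bullet: only utterances matching the command pattern are sent to the remote instance, and everything else stays with HA's internal handling.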

Just looking at the code here, none of the currently configured interfaces would be expecting any sort of response back from Ollama such as a web search. It's only using it for local inference about entities and performed sentences within HA itself.

Thanks for the reply and the assistance, @just_another_person@lemmy.world.

I'm not using any add-ons in Home Assistant for this integration, just the Ollama integration (but I suspect that's what you meant).

Its docs say that it doesn't integrate with sentence triggers, so I think this suggestion won't work. :(