this post was submitted on 28 Jul 2025

homeassistant


I recently integrated self-hosted Ollama with Home Assistant and it's pretty cool.

I'd also like to be able to ask Home Assistant's Assist things like "What movies are playing in MyCity?" and have the model in Ollama search the web and summarize the results. I use Open-WebUI as a frontend for Ollama and it can do that, but I can't see how to configure HA's Assist to do it.

Has anyone done this?

Another example would be to ask "what's the weather today?", and I realize that HA has a weather integration built in and thus I wouldn't need Ollama to search the web for results. Is there a web search integration for HA, so that any question (like "what movies are playing") could be passed to it?

My ultimate goal is to be able to ask questions about the world (i.e., search the web and summarize the results) and control Home Assistant devices. It doesn't have to be HA's Assist, but if I were to use Open-WebUI* for example, I'd have to come up with a way to link my audio-listening devices to it. It also doesn't have to be one "entity", but it should "feel like" one entity to the user (i.e., I don't want to have to speak to a different device or have a different wakeword to ask about the world than I do to control Home Assistant).

*Open-WebUI has integrations to control Home Assistant.

[–] scrubbles@poptalk.scrubbles.tech 6 points 5 days ago

So you're getting into function/tool calling with LLMs. If you don't know, tools are structured definitions that you give to your LLM (at the API/model level, not you directly) that say, in layman's terms: "I have coded a function called get_weather; if the user asks about the weather, let me know by returning a tool-call response and I will run it for you."
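To make that concrete, here's a minimal sketch of what a tool definition and the local dispatch side look like. This assumes the OpenAI-style JSON-schema tool format (which Ollama's chat API also accepts via its `tools` parameter); the `get_weather` function, its schema, and the fake tool-call response are all hypothetical stand-ins.

```python
import json

# Hypothetical tool definition in the OpenAI-style JSON-schema format.
# You pass a list like [GET_WEATHER_TOOL] to the model; if the user asks
# about the weather, the model returns a tool-call instead of plain text.
GET_WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

def get_weather(city: str) -> str:
    # Stand-in implementation; a real one would call a weather API
    # (or, in HA's case, the built-in weather integration).
    return f"Sunny and 22C in {city}"

def handle_tool_call(tool_call: dict) -> str:
    """Dispatch a tool-call response from the model back to local code."""
    if tool_call["function"]["name"] == "get_weather":
        args = json.loads(tool_call["function"]["arguments"])
        return get_weather(args["city"])
    raise ValueError("unknown tool")

# Simulated tool-call response, shaped like what the model returns:
fake_call = {"function": {"name": "get_weather",
                          "arguments": json.dumps({"city": "MyCity"})}}
print(handle_tool_call(fake_call))  # Sunny and 22C in MyCity
```

The result string would then be sent back to the model as a tool-result message, and the model writes the final answer from it.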

This is how HA works too: if you ask the LLM to do something, it is essentially running a tool call to HA for you.

Now, for your question: I'm betting there are ways to add more tools, though I don't know whether HA supports this. Either way, that's what you're looking for. This pattern is also called agents: you would create a web agent that takes in what the LLM asks for, gets the top 10 results, collates them, and gives them back to the LLM for summarization.
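The web-agent loop described above can be sketched in a few lines. Everything here is a hypothetical stand-in: `search_web` would really hit a search backend (SearXNG is a common self-hosted choice), and `summarize` would really send the collated text back to the LLM.

```python
# Minimal "web agent" loop: fetch top results, collate, hand off for summary.

def search_web(query: str, top_n: int = 10) -> list[str]:
    # Stub: a real implementation would query a search API.
    return [f"result {i} for {query!r}" for i in range(1, top_n + 1)]

def summarize(text: str) -> str:
    # Stub: a real implementation would send the collated text to the LLM.
    return f"Summary of {len(text.splitlines())} results"

def web_agent(question: str) -> str:
    """Take the query the LLM asked for, fetch the top results,
    collate them, and return the summarized answer."""
    results = search_web(question)
    collated = "\n".join(results)
    return summarize(collated)

print(web_agent("what movies are playing in MyCity"))
```

In a real setup the LLM itself decides when to invoke `web_agent` via a tool call, just like the weather example.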

I'm guessing others have done this, but now you know what you're looking for!

Thanks for the explanations, @Scrubbles@poptalk.scrubbles.tech.

I've introduced myself to tools/functions in Open-WebUI: there is a community library of tools/functions that you can essentially copy/paste into Open-WebUI, after which they're available there. If I understand correctly, tools add functionality to models and functions add functionality to Open-WebUI itself.

I'm still confused, though: if I add a web search tool to Open-WebUI (some do exist), I would guess that it can only add functionality to the model when I access the model through Open-WebUI - but HA accesses the model directly through Ollama.