this post was submitted on 05 Jun 2024
72 points (91.9% liked)

homeassistant

14828 readers

Home Assistant is open source home automation that puts local control and privacy first.
Powered by a worldwide community of tinkerers and DIY enthusiasts.

Home Assistant can be self-installed on Proxmox, a Raspberry Pi, or even purchased pre-installed: Home Assistant: Installation

Discussion of Home-Assistant adjacent topics is absolutely fine, within reason.
If you're not sure, DM @GreatAlbatross@feddit.uk

founded 2 years ago
top 15 comments
[–] scrubbles@poptalk.scrubbles.tech 29 points 1 year ago (1 children)

Great, but it's restrictive, only letting you use OpenAI and Google. I'm already hosting oobabooga text generation; let me use that.

[–] Zikeji@programming.dev 19 points 1 year ago (2 children)

I believe that's because those two APIs support function calling; open source support is still coming along.

Ah, that makes sense. That's when I'd start using it myself: self-hosted models and audio.

[–] wagesj45@kbin.run 3 points 1 year ago (1 children)

Mistral Instruct v0.3 added function calling, but I don't know if its implementation is the same/compatible. It's also fairly new, released not that long ago. Hopefully we'll get there soon. :)

[–] Zikeji@programming.dev 2 points 1 year ago (1 children)

I saw a few others, but the ones I looked at were basically instruct layers where you'd need to add your own parser. I didn't find anything (in my 3 minutes of searching) that offers an OpenAI-compatible chat completions endpoint, which is probably the main blocker.
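
For anyone curious, this is roughly the shape an OpenAI-style chat completions exchange with function calling takes. Everything here is a made-up illustration (the `light_turn_on` tool and `light.kitchen` entity are hypothetical, not part of any real integration); the point is that a compatible server returns structured `tool_calls` you can parse directly, instead of free text you'd have to parse yourself:

```python
import json

# Hypothetical request body: the client advertises a "tool" (function)
# the model is allowed to call, using a JSON Schema for its arguments.
request = {
    "model": "any-model",
    "messages": [{"role": "user", "content": "Turn on the kitchen light"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "light_turn_on",
            "description": "Turn on a light entity",
            "parameters": {
                "type": "object",
                "properties": {"entity_id": {"type": "string"}},
                "required": ["entity_id"],
            },
        },
    }],
}

# Hypothetical response: instead of prose, the server answers with a
# structured tool call. A plain instruct-tuned model would just emit
# text here, which is why you'd need your own parser for those.
response = {
    "choices": [{
        "message": {
            "role": "assistant",
            "tool_calls": [{
                "type": "function",
                "function": {
                    "name": "light_turn_on",
                    "arguments": json.dumps({"entity_id": "light.kitchen"}),
                },
            }],
        },
    }],
}

# Extracting the call is then trivial: the arguments field is a JSON string.
call = response["choices"][0]["message"]["tool_calls"][0]["function"]
args = json.loads(call["arguments"])
print(call["name"], args["entity_id"])
```

So "supports function calling" mostly means the server speaks this structured request/response format; the integration never has to scrape tool invocations out of raw model text.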

[–] wagesj45@kbin.run 1 points 1 year ago

Looking at the documentation, it looks like it relies on Mistral's Python tooling to work. I'm fairly dumb, so I don't know whether the tool suggestion from Mistral comes from some kind of separate neural net or from a special response you have to parse (or that their client parses for you?).

[–] geophysicist@discuss.tchncs.de 10 points 1 year ago (4 children)

Okay, but when can we use the weather forecast on our dashboards? That functionality was retired with no replacement.

[–] Munkisquisher@lemmy.nz 8 points 1 year ago

The free data source was cut off; there are several replacements of varying quality depending on region. The met.no one is good for me.

[–] rach@lemmy.unryzer.eu 1 points 1 year ago

It's still working for me, though I remember I needed to change the entity ID and an automation a while back.

[–] Buelldozer@lemmy.today 1 points 1 year ago (1 children)

What are you referring to when you say "when can we use the weather forecast on our dashboards?"

I've probably got the simplest, most "out of the box" dashboard setup you can imagine, and I've got forecast data showing, with automations that run against it. What am I missing?

[–] muppeth@scribe.disroot.org 1 points 1 year ago

OK, now it's definitely time to migrate my instance to something more powerful than my Raspberry Pi.

[–] bushvin@lemmy.world 0 points 1 year ago (1 children)

Oh cool, implementing mediocre algorithms. What could possibly go wrong?

[–] warmaster@lemmy.world 7 points 1 year ago* (last edited 1 year ago)

Local LLMs have been supported via the Ollama integration since Home Assistant 2024.4. Ollama and the major open source LLM models are not tuned for tool calling, so this has to be built from scratch and was not done in time for this release. We’re collaborating with NVIDIA to get this working – they showed a prototype last week.

Are all Ollama-supported algos mediocre? Which ones would be better?
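
For context, the Ollama integration mentioned above talks to a locally running Ollama server over a small REST API. A rough sketch of what a single chat turn looks like, assuming Ollama's default port (11434) and a model such as `llama3` that you've already pulled; the function names here are my own, not Home Assistant's:

```python
import json
import urllib.request


def build_chat_payload(prompt: str, model: str = "llama3") -> dict:
    """Build a single-turn request body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one JSON object instead of a response stream
    }


def ollama_chat(prompt: str, host: str = "http://localhost:11434") -> str:
    """POST a chat turn to a local Ollama server and return the reply text."""
    data = json.dumps(build_chat_payload(prompt)).encode()
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming replies carry the text in message.content.
        return json.loads(resp.read())["message"]["content"]


# Inspect the payload shape without needing a running server.
payload = build_chat_payload("Is the front door locked?")
print(payload["model"], payload["stream"])
```

Note there's no `tools` field in that payload: this plain chat API is what worked at the time, and it's exactly the missing structured tool-calling piece that the blog quote says had to be built from scratch for open models.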