This post was submitted on 04 Jan 2026

Home Assistant

This is an automated archive made by the Lemmit Bot.

The original was posted on /r/homeassistant by /u/JonathanDawdy on 2026-01-03 21:00:36+00:00.


Since GPU prices are set to rise yet again, I am hoping to get a jump start on setting up a local LLM. It doesn't need to be intense. I don't want to buy a 30-series or newer GPU if I absolutely don't have to.

It will primarily be used to replace Google Home for voice-to-action automation (e.g. turning off the lights 😂). It doesn't need to generate images or 500-page books; basic Q&A is fine. Response time is also pretty important: it should not take 45 seconds to turn off a light from a 5-word command. While image generation is a plus, it is not a priority.

I don't want to pay for a cloud-based solution, because the whole point is to break my reliance on cloud-based systems. Google Home goes down weekly and fails the simplest commands, all while selling my data and making money off me.

I hear Home Assistant has a native Ollama integration, so that will probably be what I use.

Can I use old hardware? Do I need a GPU? Is there an OS that can run the LLM as its primary workload, or must I install Linux/Windows? My budget is whatever I can convince myself to spend, within reason. In no world am I buying two 5090s and an EPYC CPU just for it to tell me the weather.

Thank you for reading my spiel, and I appreciate the support.

no comments (yet)