this post was submitted on 23 May 2025
Selfhosted

I currently have a Synology 220+ and a couple of VPSes, and I'm looking to consolidate while getting out of Synology's walled garden. I already have a couple of 3.5" drives in the Synology and four 2.5" drives lying around, and I'm planning to run a number of Docker containers and a couple of VMs.

That said, I've never built anything before; I basically went to PCPartPicker, started with the case, filtered each component for 5-star ratings, and went from there. So... how absurd is my build?

PCPartPicker Part List

| Type | Item | Price |
| --- | --- | --- |
| CPU | AMD Ryzen 5 5600X 3.7 GHz 6-Core Processor | $135.00 @ Amazon |
| CPU Cooler | Cooler Master MasterLiquid 360L Core ARGB Liquid CPU Cooler | $90.71 @ Amazon |
| Motherboard | MSI MAG B550 TOMAHAWK ATX AM4 Motherboard | $165.99 @ B&H |
| Memory | TEAMGROUP T-Force Vulcan Z 16 GB (2 x 8 GB) DDR4-3200 CL16 Memory | $26.99 @ Amazon |
| Storage | Seagate IronWolf NAS 8 TB 3.5" 7200 RPM Internal Hard Drive | Purchased for $179.00 |
| Storage | Seagate IronWolf NAS 8 TB 3.5" 7200 RPM Internal Hard Drive | Purchased for $179.00 |
| Storage | Seagate IronWolf NAS 8 TB 3.5" 7200 RPM Internal Hard Drive | $159.99 @ Adorama |
| Case | Fractal Design Meshify 2 ATX Mid Tower Case | $173.89 @ Newegg |
| Power Supply | Corsair RM650 (2023) 650 W 80+ Gold Certified Fully Modular ATX Power Supply | $89.99 @ Corsair |

Prices include shipping, taxes, rebates, and discounts.
Total: $1200.56
Generated by PCPartPicker 2025-05-23 19:32 EDT-0400
[–] themadcodger@kbin.earth 1 points 2 days ago (8 children)

Someone else mentioned the cooler too, so that's out for sure. To be honest, I never really thought about graphics in the traditional sense, but I do need something for at least Jellyfin transcoding, and maybe a small LLM. Would it be better to get a dedicated GPU, or a CPU with integrated graphics?

[–] DesolateMood@lemm.ee 3 points 1 day ago (5 children)

A dedicated GPU is obviously going to be more powerful. I've never run AI before, so maybe someone else can weigh in on the requirements for it, but I can say for sure that an iGPU is good enough for Jellyfin transcoding. It also depends on your budget: do you want to spend the extra money just for a dedicated GPU?

If you go the iGPU route, I think Intel is generally recommended over AMD (mainly for Quick Sync transcoding), but you should probably do extra research on that before buying.
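For what it's worth, with an iGPU, hardware transcoding in a containerized Jellyfin mostly comes down to passing the GPU's render device through to the container. A minimal sketch, assuming Docker Compose on Linux with an Intel iGPU; the image is the official one, but the volume paths are just example placeholders, not a tested config:

```yaml
# Hypothetical compose sketch: Jellyfin with the iGPU's render node passed
# through for VAAPI/Quick Sync transcoding. Paths are examples.
services:
  jellyfin:
    image: jellyfin/jellyfin:latest
    devices:
      - /dev/dri:/dev/dri      # expose the iGPU to the container
    volumes:
      - ./config:/config
      - ./cache:/cache
      - /mnt/media:/media:ro   # your media library, read-only
    ports:
      - "8096:8096"
    restart: unless-stopped
```

After that, hardware acceleration still has to be enabled in Jellyfin's dashboard (Playback → Transcoding), picking VAAPI or Intel Quick Sync as appropriate.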

[–] yaroto98@lemmy.org 2 points 1 day ago (4 children)

My AMD iGPU works just fine for Jellyfin. LLMs are a little slow, but that's to be expected.

[–] themadcodger@kbin.earth 1 points 1 day ago (1 children)

Yeah, I'm not sure I really want to deal with an LLM. It would mostly be for Home Assistant, so nothing too crazy.

[–] yaroto98@lemmy.org 2 points 1 day ago (1 children)

I built a very similar NAS. The Home Assistant usage doesn't really even move the needle. I'm running around 50 Docker containers and chilling at about 10% CPU.

[–] themadcodger@kbin.earth 1 points 1 day ago (1 children)

The LLM for Home Assistant, or just HA in general, doesn't move the needle? My HA is also pretty low-key, but I was considering running my own small LLM with HA to get off of OpenAI. My current AI usage is very small, so I imagine I wouldn't need too much on the GPU side, but I don't know what's sufficient.

[–] yaroto98@lemmy.org 2 points 1 day ago

Home Assistant alone doesn't move the needle. The LLMs hit the iGPU hard, and my CPU usage spikes to 70-80% when one is thinking.

But the LLMs I'm running are through Ollama, plus InvokeAI, each with several different models, just for fun.
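If you want to try the Ollama route for a small local model, a minimal Docker Compose sketch looks like the following. This assumes a CPU-only setup (GPU passthrough differs per vendor); the image and port are Ollama's defaults, and the model name below is just an example:

```yaml
# Hypothetical compose sketch for Ollama. Models are pulled at runtime,
# e.g. `docker exec -it ollama ollama pull llama3.2:3b` (example model).
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"          # Ollama's default API port
    volumes:
      - ollama:/root/.ollama   # persist downloaded models
    restart: unless-stopped

volumes:
  ollama:
```

Once it's up, Home Assistant's Ollama integration (or anything else) can point at `http://<host>:11434`.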
