This is an automated archive made by the Lemmit Bot.

The original was posted on /r/homelab by /u/Obvious_Service_8209 on 2025-07-31 14:52:13+00:00.


Just wanted to share something I’ve been quietly building — it started as a scheduler for AI workloads, but I think it might have broader implications.

A lot of us buy affordable GPUs (A2000s, 3060s, even 1050 Tis) and stick them into a homelab setup, but no one's really talking about software-level optimization.

I built a module that (rough sketches after the list):

- Monitors GPU/CPU/memory use
- Uses ML to learn from your own workload patterns
- Allocates tasks (inference, background jobs, etc.) to the best device over time
- Falls back to CPU intelligently (especially if you're working with huge RAM pools)
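
For a flavor of the monitor-and-allocate part, here's a simplified sketch (not the actual module; it assumes NVIDIA GPUs queried through pynvml plus psutil for host stats, and the device naming and 4 GB example job size are just illustrative):

```python
import psutil
import pynvml

pynvml.nvmlInit()

def snapshot():
    """Current utilization and free memory for the CPU and each NVIDIA GPU."""
    stats = {"cpu": {"util": psutil.cpu_percent(interval=0.1),
                     "free_mem": psutil.virtual_memory().available}}
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        stats[f"cuda:{i}"] = {
            "util": pynvml.nvmlDeviceGetUtilizationRates(handle).gpu,
            "free_mem": mem.free,
        }
    return stats

def pick_device(job_mem_bytes, stats):
    """Greedy pick: the least-busy GPU with enough free VRAM, else fall
    back to the CPU, where a big system RAM pool can absorb the job."""
    candidates = [(dev, s) for dev, s in stats.items()
                  if dev.startswith("cuda") and s["free_mem"] >= job_mem_bytes]
    if candidates:
        return min(candidates, key=lambda item: item[1]["util"])[0]
    return "cpu"

# e.g. route a job that needs roughly 4 GB of memory
print(pick_device(4 * 2**30, snapshot()))
```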

Result? My dual A2000 + Threadripper Pro + 128 GB ECC rig outperforms a single 3090 on many AI inference jobs, especially over longer batch sessions.

It’s kind of like “Run:ai for homelabbers.” No Kubernetes. No fluff. Just a lightweight allocator that learns how to use your machine better.
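
The learning part is simpler than it sounds. One way to do it (again a sketch, not my actual code; the epsilon-greedy policy, EMA constants, and class name here are made up for illustration) is to keep a moving average of runtime per (job type, device) and route each new job to the device with the best history, exploring occasionally:

```python
import random
from collections import defaultdict

class LearnedAllocator:
    def __init__(self, devices, alpha=0.3, explore=0.1):
        self.devices = devices          # e.g. ["cuda:0", "cuda:1", "cpu"]
        self.alpha = alpha              # EMA smoothing factor
        self.explore = explore          # epsilon-greedy exploration rate
        self.ema = defaultdict(dict)    # job_type -> {device: avg seconds}

    def choose(self, job_type):
        """Pick the historically fastest device, but try unseen devices
        first and explore a random one a small fraction of the time."""
        seen = self.ema[job_type]
        untried = [d for d in self.devices if d not in seen]
        if untried or random.random() < self.explore:
            return random.choice(untried or self.devices)
        return min(seen, key=seen.get)

    def record(self, job_type, device, seconds):
        """Fold the observed runtime into the moving average."""
        prev = self.ema[job_type].get(device, seconds)
        self.ema[job_type][device] = (1 - self.alpha) * prev + self.alpha * seconds

alloc = LearnedAllocator(["cuda:0", "cuda:1", "cpu"])
dev = alloc.choose("llama-7b-inference")
alloc.record("llama-7b-inference", dev, seconds=12.4)  # after the job finishes
```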

Thinking about open-sourcing it. Curious if anyone else would use something like this — or already built their own hacky version?

Thanks!
