Right, hopefully this doesn’t tick the "low effort post" box, but I think this is specific enough to me that it falls under the definition of help.
For context, I built myself a Threadripper machine with a pair of RTX A5000s in it a while ago, put Proxmox on it, and spun up the usual Ollama, Open WebUI and ComfyUI stack in an LXC. I then dismantled that box to make a few changes, and it's been sitting doing nothing for most of this year.
Current spec:
Threadripper 3960X
RTX A5000 x2
128GB DDR4
10Gb SFP NIC
The Proxmox installation is still on it, but I've borked enough stuff while learning how things work that it's pretty much toast. I've forgotten everything I was in the middle of and it's a mess now, so I'd like to start over.
My question is this: is Proxmox still the way to go? I've got a TrueNAS box running a bunch of Docker containers, and I've been messing around with some LLM containers using the GPU in my NAS. I'd like to move to a setup where the NAS continues to host my Docker containers but pulls the AI horsepower from this machine through an API.
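For what it's worth, that split (frontend containers on the NAS, inference on the GPU box) is mostly just a matter of pointing the clients at a remote endpoint. A minimal sketch, assuming Ollama runs on the Threadripper machine and listens on the LAN (the IP below is a placeholder), with Open WebUI on the NAS using its `OLLAMA_BASE_URL` setting:

```yaml
# docker-compose.yml on the TrueNAS box (hypothetical example)
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # Placeholder address for the Threadripper/A5000 machine;
      # Ollama there needs OLLAMA_HOST=0.0.0.0 to listen beyond localhost.
      - OLLAMA_BASE_URL=http://192.168.1.50:11434
    volumes:
      - open-webui:/app/backend/data

volumes:
  open-webui:
```

Whether the GPU box runs Ollama bare metal, in an LXC, or in a VM doesn't really matter to the NAS side; it only sees the API endpoint.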
With that in mind, I'm wondering whether I'd be better off doing a bare-metal installation and running it that way. The only wrinkle with that idea is that I was also running a few VMs on the AI workstation, plus another Arc GPU installed in it (on passthrough).
I want to make the most of what I've got, in a way that integrates with everything else on my network. Running ComfyUI in Docker on this machine is about the only consideration that makes me wonder if sticking with an LXC is the way to go, though I'll be dumping all of the output onto a mounted Samba share now.
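Dumping ComfyUI output onto the NAS share is also straightforward regardless of which route you pick; a hypothetical fstab entry for a CIFS mount (hostname, share name and paths are placeholders):

```
# /etc/fstab on the AI machine — mount the TrueNAS share for ComfyUI output
//truenas.local/comfy-output  /mnt/comfy-output  cifs  credentials=/root/.smbcred,uid=1000,gid=1000,_netdev  0  0
```

In an unprivileged LXC you'd normally do this mount on the Proxmox host instead and bind-mount it into the container, since unprivileged containers can't mount CIFS themselves by default.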
I’m about 12 months out of the loop on where the tools are, so the TL;DR is "what’s the best way to start over?"