Custom-Built AI Server - Thoughts?
reddit.com · 3h · r/LocalLLaMA

I’m working on the hardware selection for an AI server to host several AI instances running different models, ranging from text generation to basic image generation. I want to be able to run models of at least 70B parameters and have room to expand in the future (via hardware upgrades). This is what I have in mind:

CPU: AMD EPYC 7282

2.8 GHz base, 3.2 GHz max boost

16 cores, 32 threads

85.3 GB/s memory bandwidth
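That memory-bandwidth figure matters if any layers get offloaded to CPU: during generation, each token streams the active weights through RAM, so bandwidth divided by model size gives a rough tokens-per-second ceiling. A minimal sketch of that arithmetic (the 4-bit size is an assumption, not a benchmark):

```python
# Rough tokens/sec ceiling for CPU-side inference: each generated token
# reads the full set of active weights from memory, so throughput is
# bounded by memory_bandwidth / model_size. Illustrative numbers only.

def cpu_tokens_per_sec(bandwidth_gbs: float, model_gb: float) -> float:
    """Upper bound on tokens/sec when generation is memory-bandwidth bound."""
    return bandwidth_gbs / model_gb

bandwidth = 85.3                  # EPYC 7282 memory bandwidth, GB/s
model_q4 = 70e9 * 0.5 / 1e9      # ~70B params at 4-bit (~0.5 bytes/param) -> 35 GB

print(f"~{cpu_tokens_per_sec(bandwidth, model_q4):.1f} tok/s ceiling")
```

So a fully CPU-resident 70B Q4 model tops out around 2-3 tok/s on this platform; the CPU is fine as a host, but you'll want the weights on the GPUs.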

RAM: 128 GB DDR4-3200

4x 32 GB sticks

Upgradable to 2 TB across 8 slots (aiming for 256 GB or 512 GB if needed)

Motherboard: AsRock Rack ROMED8-2T

8x DIMM slots, max DDR4-3200

7x PCIe 4.0 x16

GPU: 2x Nvidia RTX 3090

48 GB VRAM total

Motherboard can support two more if needed
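A quick back-of-envelope check that 70B actually fits in 48 GB, using rough bytes-per-parameter figures for common quantization levels (assumptions, and note the KV cache and activations need headroom on top of the weights):

```python
# VRAM fit check: approximate weight size of a 70B model at different
# quantization levels vs. 48 GB total across 2x RTX 3090.
# bytes-per-parameter values are rough rules of thumb.

def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate weight footprint in GB (params in billions)."""
    return params_billions * bytes_per_param

GPU_TOTAL_GB = 48  # 2x 24 GB

for name, bpp in [("FP16", 2.0), ("Q8", 1.0), ("Q4", 0.5)]:
    w = weights_gb(70, bpp)
    verdict = "fits" if w < GPU_TOTAL_GB else "too big"
    print(f"70B @ {name}: ~{w:.0f} GB weights -> {verdict} in {GPU_TOTAL_GB} GB")
```

So two 3090s only work for 70B at ~4-bit quantization (~35 GB of weights, leaving ~13 GB for KV cache and context); FP16 or 8-bit would need the two extra GPU slots the board leaves open.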

OS: Either TalosOS or Debian w/ Docker

Using Nvidia drivers to bridge GPUs…
