Ayush Pande is a PC hardware and gaming writer. When he’s not working on a new article, you can find him with his head stuck inside a PC or tinkering with a server operating system. Besides computing, his interests include spending hours in long RPGs, yelling at his friends in co-op games, and practicing guitar.
Over the last couple of years, I’ve added pretty much every computing system I could lay my hands on to my home lab. Heck, I’ve even gone out of my way to purchase spare GPUs, RAM sticks, and even full-fledged rigs, just so I could add them to my experimentation paraphernalia. With old Xeon systems available at dirt-cheap prices, I’ve also spent a few bucks on enterprise-grade hardware.
Truth be told, old servers have some quirks that I had to get used to when I first started using them. But despite everything, I don’t regret these purchases one bit. If anything, I’m really glad I bought everything before the RAM apocalypse, as switching my primary Proxmox node to enterprise components was just what I needed for my DIY experiments.
Related
Why I’m keeping a nearly 10-year-old desktop in the big year of 2025
My ol’ gaming rig still chugs along with the rest of my computing arsenal
I can deploy virtual guests to my heart’s content
Thanks to all the CPU cores and RAM sticks
As someone who has tinkered with different virtualization platforms on all sorts of systems, I daresay home labs are versatile enough to accommodate everything from Intel N100 mini-PCs to outdated laptops on their last legs. But unless I’m using pricey consumer hardware, I need to keep a close eye on system resources. Even though I’ve cut down the number of virtual machines I use for my projects, I still spin up new containers and VMs when practicing DevOps automation, and on most systems, I have to shut down other virtual guests before running my Ansible and Terraform experiments.
Enterprise-oriented hardware, especially systems with dual CPUs, is free from this limitation. I’ve got a total of 24 CPU cores (48 threads) and 64GB of memory in my main server, and that’s more than enough to spin up new virtual guests without worrying about system slowdowns. Plus, it gives extra breathing room to my NixOS tinkering hub, Windows 11 dev VM, and other virtual machines, as I can assign each of them multiple CPU cores and plenty of RAM for maximum performance. What’s more, server-grade rigs tend to support ECC memory, which adds an extra layer of protection against bit flips, and since I bought mine well before RAM prices went bonkers, I paid roughly $100 for the ECC DDR4 sticks.
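For the curious, this is roughly what spinning up one of those guests looks like when I script it instead of clicking through the web UI. It’s only a minimal sketch using the community proxmoxer Python library against the Proxmox API; the hostname, credentials, node name, VM ID, and storage names below are placeholder assumptions, not my actual setup.

```python
# Minimal sketch: provisioning a Proxmox VM through its REST API with the
# proxmoxer library (pip install proxmoxer requests). All names and IDs
# here are hypothetical placeholders.
from proxmoxer import ProxmoxAPI

proxmox = ProxmoxAPI(
    "proxmox.lan",          # placeholder hostname for the Proxmox host
    user="root@pam",
    password="change-me",   # a real setup should use an API token instead
    verify_ssl=False,
)

# Hand the guest a modest slice of the 24-core / 64GB pool.
proxmox.nodes("pve").qemu.create(
    vmid=210,                     # any unused VM ID
    name="ansible-lab",
    cores=4,                      # vCPUs assigned to the guest
    memory=8192,                  # RAM in MiB
    net0="virtio,bridge=vmbr0",   # virtio NIC on the default bridge
    scsi0="local-lvm:32",         # allocate a 32GB disk on local-lvm storage
    scsihw="virtio-scsi-pci",
)
```

The call maps onto the same API endpoint the web UI’s Create VM wizard uses, so the parameters it accepts can be passed here as keyword arguments.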
The extra SATA ports and PCIe slots are just as useful
Aside from the processing-centric resources, the extra connectivity options on server hardware are pretty handy for home lab tasks. The no-name mobo in my Xeon system is limited to a single Gigabit Ethernet port, but I was able to use a spare PCIe slot to add a 10GbE NIC for my high-speed network switch. Likewise, I’ve added another 1TB NVMe drive to the system via a PCIe-to-NVMe adapter, while my USB expansion cards also come in handy when I need to plug external I/O components into my Proxmox hub.
Then there are all the SATA ports in my rig. Honestly, many of them sit unoccupied these days, as I’ve ditched my ancient HDDs for SSDs. But if I were to upgrade to a newer system for my virtualization needs, I could easily repurpose the old rig into a powerful Network-Attached Storage server.
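If you go the same route with an add-in NIC, it’s worth confirming the negotiated link speed before blaming the switch for slow transfers. Here’s a small sketch that reads the standard Linux sysfs entries; the interface names it prints will obviously differ per system.

```python
# Sketch: report the negotiated link speed of each network interface by
# reading Linux's sysfs entries. Values are in Mb/s; loopback devices,
# bridges, and links that are down may report -1 or error out entirely.
from pathlib import Path

for iface in sorted(Path("/sys/class/net").iterdir()):
    try:
        speed = int((iface / "speed").read_text().strip())
    except (FileNotFoundError, OSError, ValueError):
        continue  # skip interfaces without a usable speed reading
    print(f"{iface.name}: {speed} Mb/s")  # a healthy 10GbE link should show 10000
```

A 10GbE card that only negotiates 1000 Mb/s usually points to a cable or transceiver issue rather than the NIC itself.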
It can even run modern games
Provided I keep my expectations in check
Switching to a rather uncommon use case for server hardware, old enterprise rigs can be surprisingly viable for gaming. Now, I’ll never advise folks to go out of their way to pick up outdated systems designed specifically for multi-core virtualization tasks just for gaming. But if you manage to spot a Xeon system released after 2016 at a dirt-cheap price and are willing to pay extra on the energy bills (more on that in a bit), it might not be a bad investment as a makeshift 1080p gaming machine, especially with the insane price tags on modern consumer hardware.
Related
I tried gaming on a cheap dual-Xeon system - here’s how it went
While server CPUs aren’t meant for gaming, pairing them with powerful GPUs can help you attain decent FPS at higher resolutions
I’ve tested my Xeon E5-2650 v4 with some modern games, and its single-core performance is far from ideal, even by the standards of Broadwell-era CPUs. However, it’s still capable of holding its own in single-player titles. Sure, you won’t get 200+ FPS in typical e-sports titles, but for the average RPG enthusiast (or even a DIY tinkerer looking to experiment with a remote VM-based gaming machine), a server processor released in the last decade is a decent option, assuming you keep your expectations grounded and pair it with a consumer GPU instead of a server-grade graphics card.
I had to take a few risks, though
I shudder at the thought of looking for replacements
Call me a cheapskate, but I like to run my home lab gizmos until the day they kick the bucket, then swap out the faulty parts and bring them back into action. However, I can’t say the same about my server rigs. A few weeks ago, I had a scare when it seemed like the Xeon system had died, though a little bit of troubleshooting revealed there wasn’t anything wrong with my rig.
That said, searching for hardware that’s compatible with my old server is a lot more cumbersome. After all, I’ve got a no-name motherboard and ECC memory, and the latter is way past the affordable mark these days. Heck, I can’t even find the retailer that sold me this system anymore, meaning I’d have to switch back to consumer devices or try my luck in the wild west of Facebook Marketplace if any of the server parts break down.
But everything is green on the power front, mind you
Let me preface this section by adding that server rigs tend to consume more energy than typical consumer-tier machines, especially compared to the thin clients and mini-PCs many folks use for DIY projects. That said, there are a couple of ways to keep their electricity-guzzling nature in check. I’ve already replaced the 3.5-inch HDDs with SSDs, and the extra responsiveness is a nice perk on top of the lower power consumption. I’ve also enabled C-states in the mobo’s BIOS and configured the CPU governor to use the powersave profile.
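If you’d rather script that last tweak than echo values into sysfs by hand, here’s a minimal sketch of what switching every core to the powersave governor looks like on a Linux host such as Proxmox. It assumes a cpufreq driver is active, needs root, and doesn’t persist across reboots.

```python
# Sketch: set the cpufreq scaling governor to "powersave" on every CPU core
# via the standard Linux sysfs interface. Run as root; the change applies
# immediately but is lost after a reboot.
from pathlib import Path

def set_governor(governor: str = "powersave") -> None:
    for gov_file in Path("/sys/devices/system/cpu").glob("cpu[0-9]*/cpufreq/scaling_governor"):
        available = (gov_file.parent / "scaling_available_governors").read_text().split()
        if governor in available:
            gov_file.write_text(governor)
        else:
            print(f"{gov_file.parent.parent.name}: '{governor}' not offered (available: {available})")

if __name__ == "__main__":
    set_governor()
```

The cpupower frequency-set -g powersave command does the same job, and a small systemd unit or cron @reboot entry can reapply either approach after a restart.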