The Lab

Let’s get this out of the way really quickly: I’m going to try my best to avoid talking about specifics of work on this site. I am well aware of the volume of content I’m completely ignoring, but I suppose that’s a sacrifice I’m willing to make in the name of privacy.

The data center environments I’ve worked with have all had non-production and production environments. The number of non-production environments tends to vary, and they have all sorts of names: staging, development, test, training, sandbox, lab, and so on. These are great for testing technologies that we’re going to be working with in that environment, but if I want to learn something new, I don’t think it’s all that appropriate to use those resources. To that end, I’ve built a limited home lab that I can use.

This lab has gone through a number of iterations, but it’s been around in some form since I lived in my parents’ basement. Yes, I’ve had a “server” that was capable of virtualization since at least 2004, when it ran Microsoft Virtual PC. Let’s not talk about that one, though.

Most of my home servers since then have been repurposed gaming rigs running Proxmox VE. The first one was based on an i7-2600k with 32GB of RAM — not very server-like at all. There has only been one proper server among the lot, which was based on the SuperMicro X8DTH-IF dual-socket motherboard. It started with a pair of Xeon X5650s, which were pretty beefy, but highly inefficient. The server tripped the breaker under heavy load, so I downgraded the X5650s to L5640s, and all was well.

As the complexity of my projects increased, I found that the SuperMicro couldn’t keep up, so I grabbed a SuperMicro X9SRL-F and a Xeon E5-2690 v2. Yes, I took a hit in the core count department, but at least I didn’t have to buy new RAM. Power delivery constraints meant I couldn’t run both servers at once anyway, so the loss didn’t matter much. This server lived in a violently green Thermaltake case with a tempered glass window, originally home to the i7-2600k, and it lasted a couple of years. Not a single breaker trip, though, which is nice.

Somewhere along the way I upgraded my main computer from a Ryzen 9 3900X to a 5900X, which meant I had 12 cores just sitting around doing nothing. It took a while, but I eventually settled on socketing that 3900X into an ASRock Rack B550D4-4L motherboard. That replaced the E5-2690v2, and as expected, utterly destroys it in every possible performance metric.

The main reason I chose the B550D4-4L instead of a much cheaper B550-based board was the BMC. As far as I can tell, ASRock is the only company that makes an AM4-based board with IPMI. As the 3900X lacks an iGPU, and I had a dGPU I wanted to pass through to a VM, I needed a motherboard that had a built-in display output. Also, I don’t want to go down into the crawlspace every time I want to watch the machine boot, as infrequent as that is.
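That remote-visibility use case is exactly what the BMC covers. As a rough sketch, a standard IPMI client can check power state and attach to the serial console from the couch instead of the crawlspace; the address and credentials below are placeholders, not my actual setup:

```shell
# Query the chassis power state over the LAN (BMC IP/credentials are made up)
ipmitool -I lanplus -H 192.168.1.50 -U admin -P changeme chassis power status

# Attach to the Serial-over-LAN console to watch the machine POST and boot
ipmitool -I lanplus -H 192.168.1.50 -U admin -P changeme sol activate
```

ASRock Rack BMCs also expose a web UI with a remote KVM, which covers the same ground when a graphical console is needed.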

Beyond that, the machine is a mix of parts carried over from previous boxes and new purchases:

  • CPU: AMD Ryzen 9 3900X
  • Cooler: Arctic Cooling Liquid Freezer II 280mm
  • Motherboard: ASRock Rack B550D4-4L
  • RAM: 128GB DDR4-3200
  • Boot Volume: 240GB Samsung 883 DCT (basically an 860 Evo)
  • VM Boot Storage: 1TB Samsung 970 Evo Plus
  • VM Data Storage: 5x 16TB Seagate Exos, RAID-Z2
  • dGPU: PNY Quadro P2200
  • Case: Fractal Design Define 7

The Quadro started off in the X9SRL board and proved to be the perfect tool for transcoding videos without any weird trickery. I’ve been using that 970 Evo Plus for a while as well; it started out on a Silverstone M.2 adapter card, since the X8 and X9 boards didn’t have any M.2 slots. The only new part of the bunch was the Liquid Freezer II; the CPU and case were gaming rig material, and the rest came from eBay, for better or for worse.
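With the card passed through to a VM, the "no weird trickery" part is that ffmpeg can drive the P2200’s NVENC/NVDEC blocks directly. A hedged example, with placeholder filenames and assuming NVIDIA drivers are installed inside the VM:

```shell
# Decode on the GPU and re-encode with the NVENC HEVC encoder, copying audio.
# input.mkv/output.mkv are placeholders; -preset p5 is a mid-range
# quality/speed trade-off on newer ffmpeg builds.
ffmpeg -hwaccel cuda -hwaccel_output_format cuda \
  -i input.mkv \
  -c:v hevc_nvenc -preset p5 \
  -c:a copy \
  output.mkv
```

Keeping frames on the GPU with `-hwaccel_output_format cuda` avoids a round trip through system memory between decode and encode.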

So … now you know the base on which I build everything. Later on I’ll talk about the networking and VMs.