In my last post, I discussed what matters to me when building a new machine for my lab and my reasons for choosing certain options.  I promised to report on what I bought and built after making those decisions and how it all went, so here it is!

Hardware:

Motherboard: The decision made in the last post was to go for a white-box solution rather than a branded one.  So the question was: which motherboard to go for?  As you know, there is no shortage of good motherboards with varying capabilities.  To me, support for the latest processors and capacity for a large amount of memory were important.  So I started looking at motherboards that supported the “Sandy Bridge-E” line of processors and 64 GB of RAM.  A few were available that met the criteria, but I went for an Asus P9X79 PRO Motherboard.  There is also a “Deluxe” version of the motherboard available, which offers some extra connectivity and built-in WiFi, but I didn’t need any of that.  Even though I am not using it initially, I quite like the SSD Caching feature and would like to keep it in my back pocket in case my plans change for this machine.

Processor: I wanted the best price/performance ratio, yet didn’t want to go for something that would be outdated as soon as I bought it.  After comparing speeds and the premium I would pay for them, I decided to go for an Intel Core i7 3820 Quad Core CPU.  Reasonably priced and suitable for my setup!

Memory: As mentioned before, I needed a motherboard that supports 64 GB of RAM.  This was to ensure that the number of VMs I can run is not restricted by insufficient RAM capacity.  Initially, I am going to put 32 GB in and see how things work.  If required, I’ll put in the rest.  In my Xeon-based machine, the quality of the RAM mattered quite a bit, but this time I wanted quantity and not necessarily quality.  As long as it works and is reliable, I am happy.  For that reason, I took the cheaper route and didn’t go for a “branded” option.  I bought Komputerbay 32GB (4x 8GB) DDR3 PC3-12800 1600MHz DIMM with Blue Low Profile Heatspreaders.  They go nicely with the motherboard (in terms of colour as well :-)) and haven’t let me down yet.
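As a quick sanity check on the RAM headroom, here is a rough back-of-envelope sketch in Python.  The host overhead and per-VM allocations are my own assumptions for illustration, not measurements:

```python
# Rough VM headroom by RAM. The host overhead and per-VM allocations
# below are assumptions for illustration, not measured values.
HOST_RAM_GB = 32          # installed now; the board takes up to 64
HOST_OVERHEAD_GB = 4      # assumed: Windows 7 + VMware Workstation itself
VM_SIZES_GB = [2, 4, 8]   # typical lab VM memory allocations

available = HOST_RAM_GB - HOST_OVERHEAD_GB
for size in VM_SIZES_GB:
    print(f"{size} GB VMs: roughly {available // size} running concurrently")
```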

CPU Cooler: I mentioned in my last post that I try to build my machines to be as quiet as possible.  To keep this one quiet as well, I went for my tried and tested brand: Noctua.  Their coolers are big but very quiet and are perfect if you don’t want to go down the liquid-cooling route.  This is what I bought: Noctua D14-2011 Dual Radiator PWM CPU Cooler.  If you’re going for this cooler, make sure you get the version for your socket type, as there are versions of the same cooler for other sockets.  You also want to make sure you’ll have enough clearance in your chassis and above the RAM slots after installation so that you don’t run into problems.  Once I was happy that everything would fit OK, I went for it.

Case: Initially, I was going to use the same case I had for my retiring computer.  However, once I decided to go with the Noctua cooler, I had to buy a new one.  I started looking for something that could fit lots of fans (if required) and possibly a water-cooling kit as well.  I was also after proper built-in cable management and a “tool-less” chassis, and lots of expansion capacity was another consideration.  Bearing all this in mind, I chose the Corsair Carbide Series 400R Mid-Tower Gaming Case.  It has all the features I was looking for (just look at the number of fans it can accommodate!), plus USB 3.0 connectors on the front panel and built-in mounting for SSDs – something I was going to use right from the start.  It also has enough clearance to easily accommodate the Noctua cooler and still have space to spare!

Power Supply: Fortunately, I was able to re-use the power supply from the old system.  I don’t have a link to it, but it’s an Enermax 600W power supply.  I always buy power supplies rated well above what I actually require.  That ensures stability even at high load, and if I add things to the machine later, the supply can cope easily.  One thing you don’t want in your system is a dodgy power supply!  An over-specified power supply also means I can re-use it years later, which is what I am doing here :-).  One issue I do have: it does not have modular cables, as such power supplies were rare when I bought it.  Generally that’s not a problem, but cable management in the new case means routing the cables through holes in specific places, which requires longer cables.  Everything else was fine, but the 12V CPU power connector falls a bit short if I route it properly.  For that reason, I’ve taken a shortcut for now and plan to buy an extension soon.
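For anyone sizing their own supply, this is the kind of rough power budget I have in mind.  Apart from the CPU’s rated 130W TDP, the figures below are illustrative assumptions rather than measured numbers:

```python
# Very rough power budget to sanity-check the 600W supply.
# All figures except the CPU TDP are assumptions for illustration.
components_w = {
    "CPU (i7-3820, 130W TDP)": 130,
    "Motherboard + 32 GB RAM": 60,    # assumed
    "SSD": 5,                         # assumed
    "Case and CPU fans": 10,          # assumed
    "Graphics card (older PCIe)": 75, # assumed; varies widely by card
}

total = sum(components_w.values())
psu_w = 600
print(f"Estimated draw: ~{total} W of {psu_w} W ({total / psu_w:.0%} load)")
```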

Storage: As mentioned in my first post, my plan is to use an SSD with thin-provisioned VMs to have more IOPS available on the system.  The SSD I bought for this purpose is a 512GB Crucial m4 2.5-inch SATA 6Gb/s (SATA III).  Again reasonably priced, and it should fit enough thin-provisioned VMs to use up all the RAM.  VMs that may require a lot of disk space will run on my other machine, which has a RAID 10 set.  Time (or maybe my next post :-)) will tell if this decision works as well as I hope.
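To check that the SSD, rather than the RAM, won’t be the limit, here is a small sketch along the same lines as the memory one above.  The per-VM disk usage figure is an assumption; thin-provisioned (growable) disks only consume the space the guest actually writes:

```python
# Back-of-envelope check that the 512 GB SSD can hold enough
# thin-provisioned VMs to exhaust the RAM. All figures are assumptions.
SSD_GB = 512
RESERVED_GB = 60           # assumed: host OS, page file, ISOs, snapshots
AVG_VM_DISK_USED_GB = 15   # assumed actual usage per thin-provisioned lab VM
HOST_RAM_GB, HOST_OVERHEAD_GB, VM_RAM_GB = 32, 4, 2  # as in the memory sketch

vms_by_disk = (SSD_GB - RESERVED_GB) // AVG_VM_DISK_USED_GB
vms_by_ram = (HOST_RAM_GB - HOST_OVERHEAD_GB) // VM_RAM_GB
print(f"Disk-limited: ~{vms_by_disk} VMs, RAM-limited: ~{vms_by_ram} VMs")
print("Bottleneck:", "RAM" if vms_by_ram <= vms_by_disk else "disk")
```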

Graphics Card: Again, I don’t have a link for this one as it’s also from my retiring system.  This motherboard doesn’t have built-in video, so a graphics card is required – in this case, a PCI Express one.  As we’re not really after graphics capability here, my old card will do nicely, but if you are the gamer sort (or want to take advantage of the better graphics support in VMware Workstation 9) then there is no reason why you can’t use a higher-end card.

Software:

OS: This one is easy.  With VMs, you want a stable host OS with everything fully supported, and a smaller footprint also helps.  For that reason, I’ve gone for Windows 7.  Even though I briefly tested Windows 8 when I had the chance, there is no chance I am going to run it on this machine for now.  The bundled drivers don’t support Windows 8 yet anyway, as it’s still too new, so I guess the decision was a given.

Hypervisor: As mentioned in the last post, I chose VMware Workstation as the platform to run my VMs.  What I didn’t mention was that I am going for version 9.0 on the new machine.  It has native support for Windows Server 2012 and Hyper-V, and I like simplicity. 🙂  I’ll keep version 8 on my other machine so that I have both versions running on different machines, in case I run into version issues.
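If you do want to run Hyper-V (or another hypervisor) inside a Workstation 9 VM, the VM needs hardware virtualization exposed to the guest.  The helper below is just my own sketch of flipping the relevant .vmx setting on a powered-off VM; the same thing can be done from the VM’s processor settings in the Workstation UI, and the path in the example is hypothetical:

```python
# Minimal sketch (my own helper, not a VMware tool): enable nested
# virtualization on a powered-off VM so a Hyper-V guest can start.
# "vhv.enable" corresponds to the virtualize VT-x/EPT option in the UI.
from pathlib import Path

def enable_nested_virt(vmx_path: str) -> None:
    vmx = Path(vmx_path)
    lines = vmx.read_text().splitlines()
    # Remove any existing vhv.enable entry, then append the enabled one.
    lines = [l for l in lines if not l.lower().startswith("vhv.enable")]
    lines.append('vhv.enable = "TRUE"')
    vmx.write_text("\n".join(lines) + "\n")

# enable_nested_virt(r"D:\VMs\Win2012\Win2012.vmx")  # hypothetical path
```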

With all the hardware and software gathered, I went to my kitchen table, cleared a lot of space and put it all together.  There was nothing unusual about the build to write about, apart from the issue mentioned above where the 12V CPU power cable fell a bit short when routed through the cable-management holes.  After installation, the system looks like this:

Fully Built Lab Machine

Lab Machine After Installation

Thanks to the chassis design, the absence of bulky hard disks and the cable management, airflow is naturally good and the system runs so cool that the BIOS started throwing errors saying the fans were running too slowly!  Maybe I need to put some load on the system :-).  After that, Windows 7 was installed and updated, and all drivers were installed from the support CD.  All devices were detected without issues, so the provided drivers work fine.  The system is now in my study, connected to my dual-headed KVM and working nicely.

That’s it for this post.  I intend to write another post in the series to report on how the system fares compared to the old one and whether the choices made in building it were sound.  This post took too long to get out due to work commitments, but I will try to get the next one out as soon as possible.

As always, please feel free to ask if there are any questions on this or post a comment to share your experiences.  It’s always good to know what other people are doing!

Update: Part III of the post is here: Building (or Upgrading) a Virtual Home Lab Machine – Part III