NAS Performance - SPEC SFS 2014

Evaluation of the system as a storage node on the network can be done using multiple approaches. As a simple NAS accessed from a single client, Intel NASPT would work very well. There are other artificial benchmarking programs such as IOMeter and IOZone (all of which are used in our dedicated NAS reviews). However, when it comes to file servers used in business scenarios, business metrics make more sense. For example, a database administrator might want to know how many simultaneous databases a given machine could sustain, while an administrator in a software company might want to know how many simultaneous software builds could be processed if the machine were used as a storage node. SPEC SFS 2014 allows us to evaluate systems based on such business metrics.

Prior to discussing the various business scenarios, let us take a look at the test setup (including details of the testbed and how the file server itself was configured).

Solution Under Test Bill of Materials

  • ASRock Rack C2750D4I in a U-NAS NSC-800 (8GB RAM)
  • AnandTech NAS Testbed (64GB RAM, 1GB to each member VM)
  • Netgear GSM7352S Ethernet Switch

Component Software

  • ASRock Rack C2750D4I system running Windows Storage Server 2012 R2
  • Load generators running on AnandTech NAS Testbed (10x Windows 7 VMs in a Windows Server 2008 R2 Hyper-V Installation)

Storage and File-Systems

  • ASRock Rack C2750D4I - 8x OCZ Vector 128GB SSDs: Storage Spaces with Parity Space
  • AnandTech NAS Testbed - NTFS partitions created at OS install time on OCZ Vertex 4 64GB SSDs

Transport Configuration

  • ASRock Rack C2750D4I - 2x 1GbE LAN Ports in 802.3ad LACP to Netgear GSM7352S
  • AnandTech NAS Testbed - 11x 1GbE LAN Ports to Netgear GSM7352S (1x management, 1x to each of 10 VMs)
  • All SMB benchmark traffic flowed through the Netgear GSM7352S network switch

The four business metrics that we will be looking at today are:

  • Database
  • Software Build
  • Video Data Acquisition (VDA)
  • Virtual Desktop Infrastructure (VDI)

The database and software build categories are self-explanatory. The VDA profile refers to usage of a storage node as a recording target for streaming video (usually from IP cameras). The VDI profile refers to the number of virtual desktops / virtual machines that can be supported using the file server as a storage node for the virtualization infrastructure.

Database

The following graphs show the requested and achieved op rates for the database workload. Note that beyond four databases, the gap between the two is more than 10%, which means that the storage system is unable to sustain more than four databases concurrently. In all of the workloads, it is latency, rather than the available bandwidth, that determines suitability.

Database Workload - Op Rates

Database Workload - Latency and Bandwidth
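
To make the 10% criterion above concrete, here is a minimal Python sketch that applies the same rule to a series of load points and reports the highest business-metric count that is still sustained. The op-rate numbers in it are hypothetical placeholders, not the measured results from this review.

    # Minimal sketch of the pass criterion described above: a load point counts
    # as sustained only if the achieved op rate stays within 10% of the
    # requested op rate. The numbers below are hypothetical placeholders.
    load_points = [
        # (number of databases, requested ops/s, achieved ops/s)
        (1, 500, 498),
        (2, 1000, 995),
        (3, 1500, 1480),
        (4, 2000, 1965),
        (5, 2500, 2150),  # more than 10% short - not sustained
    ]

    def max_sustained(points, tolerance=0.10):
        """Return the largest business-metric count whose achieved op rate
        is within `tolerance` of the requested op rate."""
        sustained = 0
        for count, requested, achieved in points:
            if achieved >= (1.0 - tolerance) * requested:
                sustained = count
            else:
                break  # later load points are not counted once a point fails
        return sustained

    print("Sustainable databases:", max_sustained(load_points))  # prints 4

The same check applies to the software build, VDA, and VDI numbers further down the page.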

The SPEC SFS 2014 benchmark also provides a summary file for each workload, which contains additional data beyond what is graphed above. The summary for the database workload is available here.
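
As a rough illustration of how that extra data can be tabulated, the sketch below prints average latency and bandwidth per load point from a CSV export of the summary. The file name and column names are assumptions for the example, not the exact layout that SPEC SFS 2014 produces.

    # Sketch only: tabulate average latency and bandwidth per load point.
    # "database_summary.csv" and its column names are hypothetical.
    import csv

    with open("database_summary.csv", newline="") as f:  # hypothetical export
        for row in csv.DictReader(f):
            print(f"{row['business_metric']:>3} database(s): "
                  f"{float(row['avg_latency_ms']):6.2f} ms average latency, "
                  f"{float(row['bandwidth_kbps']):10.1f} KB/s")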

Software Build

A similar analysis for the software build benchmark profile shows that the system is able to support up to 10 builds without any problems.

Software Build Workload - Op Rates

Software Build Workload - Latency and Bandwidth

The report summary for the software build workload is available here.

Video Data Acquisition

Video data acquisition for up to 10 streams is easily handled by our DIY solution.

VDA Workload - Op Rates

VDA Workload - Latency and Bandwidth

The report summary for the VDA workload is available here.

Virtual Desktop Infrastructure

VDI presents a very sorry story. The achieved op rate is not even close to the requested rate, and the solution appears incapable of supporting any virtualization infrastructure.

VDI Workload - Op Rates

VDI Workload - Latency and Bandwidth

The report summary for the VDI workload is available here.

Comments (48)

  • rrinker - Monday, August 10, 2015

    This chassis looks like just the thing to replace my WHS box. I was probably just going to run Server 2012 R2 Essentials and change over my StableBit DrivePool to the standard Server 2012 version. All these NAS boxes and storage systems that everyone seems to go nuts over - none of them I've seen have the flexibility of the pooled storage that the original WHS, and WHS 2011 with DrivePool, have had all along. Of course there are the Windows haters - but my WHS has been chugging along, backing up my other computers, storing my music and movies, playing movies through my media player, and the only time it's been rebooted since I moved to my new house a year and a half ago was when the power went out. It just sits there and runs. One of the best products Microsoft came up with, so of course they killed it. Essentials is the closest thing to what WHS was. Replacing a standard mid tower case with something like this would save a bunch of space. 8 drives, plus a couple of SSDs for the OS drive... just about perfect. I currently have 6 drives plus an OS drive in my WHS, so 8 would give me even more growing room. I have a mix of 1TB, 2TB, and 3TB drives in there now; with this, up to 8x 4TB would be a huge leap over what I have now.
  • DanNeely - Monday, August 10, 2015

    At $400 for a (non-education) license, S2012 R2 Essentials is a lot more expensive than I want to go. If I build a new storage server on Windows I'm 99% sure I'll be starting with a standard copy of Win10 for the foundation.
  • kmmatney - Monday, August 10, 2015

    The only thing missing from Windows 10 is the automated backup, which works great on WHS. That's the main thing holding me back from moving off WHS. I had to do a few unexpected bare-metal restores after installing Windows 10 on a few machines, and WHS really came through there. I had several issues restoring, but at the end of the day, it was successful in every instance.
  • kmmatney - Monday, August 10, 2015

    I'm also a WHS 2011 + stablebit drivepool user. Best of everything - you can add or remove single drives easily, the data is portable and easy to extract if needed, you can choose what gets mirrored, and what doesn't. The initial balancing takes a while, but after that the speed is fine. I'm up to 8 drives now (7 in the drive pool), and can expand to 12 drives with my Corsair carbide case and a $20 SATA card. I keep an 80GB SSD out of the pool for running a few Minecraft servers. This DIY NAS is interesting, but it would be far cheaper for me to just replace some of my smaller drives with 4 TB models if I need more storage.

    Since WHS 2011 is Windows 7 based - it should still last a while - I don't see a need to replace it anytime soon. But my upgrade path will probably be Windows 10 + StableBit DrivePool. Cheap and flexible.

  • DanNeely - Monday, August 10, 2015

    WHS 2011 is a pure consumer product (and based on a server version of Windows, not Win7), meaning it only has a 5-year supported life cycle. After April 2016, it's over and no more patches will be issued.
  • Navvie - Tuesday, August 11, 2015

    I agree. Not being able to expand vdevs easily is a limitation. But weighing the pros and cons, it's a small price to pay.

    The last time I filled a vdev, I bought more drives and created an additional vdev.
  • BillyONeal - Monday, August 10, 2015

    If you want to pay the premium for hardware that can run Solaris nobody's stopping you.
  • ZeDestructor - Monday, August 10, 2015

    ZFS is available on both FreeBSD and Linux, so it's no more expensive than boring old softraid on Linux.
  • bsd228 - Friday, August 14, 2015

    What premium? I've run Solaris on many intel and amd motherboards, but most recently with the HP Microserver line (34L, 50L, 54L).
  • digitalgriffin - Monday, August 10, 2015

    These are good articles. And for someone with a serious NAS requirement they are useful.
    But 99% of home users don't need a NAS.
    Of the 1% of us that do, only 1% need 8 bays with a $200 case and a slow $400 Intel board. That's the start of a serious gaming system with a motherboard that has at least 6 SATA connections.

    For example, a Cooler Master HAF 912 will hold over 8 drives and is $50.
    6 SATA port, socket 1150 motherboard: $120.
    3.2GHz i3 (low-power Y or T version): $130.
    PCIe SATA card: $50.

    Let's see you build a budget system that can:
    Handle 5 drives (boot/cache, RAID 6 (two data drives + 2 parity)).
    Handle transcoding with a Plex server.
