ELCOM: More Microsoft 2008 Stack testing results


We've done a little more testing of our product on the Microsoft 2008 stack and I'm pleased (and a little worried) with the results.

The pleasing part: we are now seeing the 2008 stack perform up to 16X faster than the 2003/2005 stack.

The worrying part: results of 16X faster are very high – what if we've tested something incorrectly…

So, the goal of this post is to explain what we did and how we measured the results. Then, I'm after your feedback on what else we should be checking.



First, the background.

In January we moved our Elcom web site over to the 2008 stack. That is, we recompiled our entire Community Manager application to target the .NET 3.5 Framework, moved to SQL Server 2008 CTP5 as the back-end, and hosted it on Windows 2008 Server RC0. We immediately noticed that the site was much quicker: browsing was faster, and managing the content (through all the web-based administration tools) was very snappy. We then did some basic testing by launching 1,000 concurrent requests and measuring the delivery times. You can read the full results on Alan's blog, but in summary, the 2008 stack was 5 times faster. Impressive stuff.

We always wanted to do some more detailed testing, but it wasn't until Windows Server 2008 finally RTMed two weeks ago that we started testing again. This time round the techies wanted to beef up the testing a little.

Latest testing

Whereas our initial testing had been a simple comparison between Windows 2003 Server, SQL 2005 and .NET 2.0 versus Windows 2008, SQL 2008 and .NET 3.5, this time Alan set up testing for all 8 combinations so that we could isolate which components of the 2008 stack were contributing the most. That is: is it Windows, SQL or .NET that is giving the improvement?

He also changed the testing method. This time, instead of just firing 1,000 requests, he kicked off a loop of 1,000 sets of 20 concurrent requests. This is a more realistic scenario for what our sites experience. He fired this via a wget-based Linux bash script.
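We haven't published the actual script, but its shape is simple enough to sketch. Everything below (the URL, the wget flags, the dry-run default) is my assumption, not the script Alan ran:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the load loop: SETS iterations of CONCURRENCY
# parallel wget fetches. CMD defaults to echo so the sketch dry-runs;
# clear it (CMD="") to fire real requests.
URL="${URL:-http://www.example.com/}"
SETS="${SETS:-1000}"
CONCURRENCY="${CONCURRENCY:-20}"

fire_load() {
  for i in $(seq "$SETS"); do
    for j in $(seq "$CONCURRENCY"); do
      ${CMD-echo} wget -q -O /dev/null "$URL" &   # one concurrent request
    done
    wait   # let all requests in this set finish before starting the next set
  done
}
```

A dry run just prints the 20,000 wget invocations; pointing it at a real site (and adding wget's recursive options to pull in linked pages, as our test did) is left to the reader.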

The machines he used are as follows:

2008 stack machine: Intel Core 2 Duo 2.12 GHz, 2GB RAM, 160GB Seagate SATA drive.

Testing (Linux) machine: Intel P4 3.0 GHz, 1GB RAM, 80GB Seagate SATA drive [ie this machine acts as the client].

As you might notice, the 2008 stack machine was acting as both the IIS and SQL server. Although this is not a production architecture, it was deliberate, to ensure that network effects were minimised.


Here's the results so far:

Windows | .NET | SQL       | Duration (h:mm:ss) | X faster
2003    | 2.0  | 2005      | 3:37:59            | (baseline)
2003    | 2.0  | 2008 CTP5 | not yet run        |
2003    | 3.5  | 2005      | 3:59:14            | 0.9
2003    | 3.5  | 2008 CTP5 | not yet run        |
2008    | 2.0  | 2005      | 0:13:55            | 15.6
2008    | 2.0  | 2008 CTP5 | 0:14:17            | 15.2
2008    | 3.5  | 2005      | 0:13:43            | 15.9
2008    | 3.5  | 2008 CTP5 | 0:13:33            | 16.1
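The "X faster" column is simply the baseline duration divided by the scenario duration, once the h:mm:ss figures are converted to seconds. For example, checking the full-2008-stack row:

```shell
# Convert the two h:mm:ss durations to seconds, then take the ratio.
baseline=$((3 * 3600 + 37 * 60 + 59))   # 3:37:59 on the full 2003 stack
full2008=$((13 * 60 + 33))              # 0:13:33 on the full 2008 stack
awk -v b="$baseline" -v s="$full2008" 'BEGIN { printf "%.1fX faster\n", b / s }'
# prints "16.1X faster"
```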

A few observations:

  1. We haven't finished all our Windows 2003 testing yet. Those runs take almost 4 hours each, and the guys are fitting the testing in amongst their normal daily work :-)
  2. The main improvement is (obviously) due to Windows Server 2008 and IIS7.
  3. Running the 3.5 Framework on Windows 2003 with SQL 2005 is actually slower than running 2.0.
  4. Some of the other results are puzzling. For example, on Windows 2008, using SQL 2008 is slower than SQL 2005 when we target 2.0, but faster when we target 3.5.
  5. Happily, the best result is achieved when using the full 2008 stack.

To put the results another way, we can see that adding 2008 components (Server, then .NET, then SQL) yields incremental improvements:

Windows | .NET | SQL       | Duration (h:mm:ss) | Speed vs baseline
2003    | 2.0  | 2005      | 3:37:59            | (baseline)
2008    | 2.0  | 2005      | 0:13:55            | 1566%
2008    | 3.5  | 2005      | 0:13:43            | 1589%
2008    | 3.5  | 2008 CTP5 | 0:13:33            | 1609%

But there's no escaping that the major improvements are all thanks to Windows 2008 Server. In general Windows 2008 is much faster, and our techies have already switched to using it as their desktop OS.


Some questions. I'm trying to look at our testing as critically as possible. After all, 1600% improvement is something not to be taken lightly, and there will likely be a few people reviewing these results. So, I want to be clear on where we stand.

The first thing to state is that we are not a certified testing lab. So, this testing has been conducted on simple server hardware, and with tests that are geared to how we operate. We didn't have any of our big rack servers available for testing, so the results are pretty 'real world'.

Q: Why does it take almost 4 hours to do 20,000 requests on Windows 2003? Something must be wrong…

A: Yes, this concerned me as well. But here's what I didn't mention earlier. Each request returns our home page, plus graphics, plus all links from that page, and any graphics and documents from those pages as well (ie wget with the u directive is basically spidering from the page). It turns out that each request returns 6.4MB of data, so the 20,000 requests return about 128GB in total. That volume is what makes it a genuine load test of the server, and on Windows 2003 it takes over three and a half hours.
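The total transfer is easy to sanity-check (using decimal megabytes and gigabytes):

```shell
# 20,000 requests x 6.4 MB per request, expressed in GB
awk 'BEGIN { printf "%.0f GB\n", 20000 * 6.4 / 1000 }'   # prints "128 GB"
```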

Q: Did you test on a smaller page?

A: Yes, we also performed testing on a landing page that returned 70KB. The results were much quicker, but in the same proportions.

Reducing the page size further, to very small (eg an empty page, but still delivered by our Community Manager product), brought the comparisons closer, with Windows 2008 being only 4X quicker than Windows 2003: the 1,000 x 20 concurrent requests took 51 seconds on Windows 2003, but only 12 seconds on Windows 2008.

Q: Are the actual durations meaningful?

A: No, the durations on their own are not that meaningful, since changing hardware will obviously affect the results. What is meaningful is comparing the durations between the different scenarios.

Q: Did you try on different hardware?

A: Yes, we've managed to get our hands on a beefier machine (Quad Core 3.5 GHz, 8GB RAM) for a few days. Initial testing has shown quicker results on Windows 2003, but in the same order of magnitude (ie hours). Full results from that machine will be available in the next few weeks.

Q: How many times did you run these tests?

A: Most tests were run a few times as the techies were setting them up and performing initial loops. All the tabled results are from testing scenarios run by Alan. Brad was then asked to run a few of the tests again independently; he verified the baseline 2003 Server result and the first of the 2008 Server results (ie with .NET 2.0 and SQL 2005).

The results above are based on the last and most thorough pass (by Alan). Thus, each figure is an actual result, not an average of repeated tests. Ideally we would average multiple runs, and as time permits we'll re-run all the tests and provide averages.

Q: Did you perform any optimisation of the 2008 setup?

A: No, all test scenarios were conducted on freshly installed, out-of-the-box setups.

Q: Did you try optimising the Windows 2003 setup?

A: Yes, we did. The techies tried various combinations of enabling compression, increasing memory, monitoring CPU usage, tweaking application pools, and making other memory-related changes. In all cases there was very minimal change in Windows 2003 performance. The results tabled above are based on the untweaked, out-of-the-box installation.

Q: What version of Windows Server is the Elcom web site running on?

A: The actual live Elcom web site is still on Windows 2008 RC0. We aim to cut over to the RTM version once our SPLA license is finalised… 2008 SPLA licensing should be available this month (fingers crossed).

Q: Are you using any .NET 3.5 specific features in these scenarios?

A: No, we've made sure that the code base is exactly the same; the only change has been the framework version targeted at compile time (and, of course, the corresponding IIS settings).

Next steps for Elcom

So, where to from here?

We'll be moving a few of our client sites over to Windows 2008 hosting in the coming weeks (assuming licensing is all sorted). A few of our staging servers are already in transition. And from April onwards we plan to move all our hosting onto Windows 2008. [We'll still be supporting clients who host on Windows 2003, of course.]

Next, we are upgrading clients to .NET 3.5.

Our next Community Manager.NET release (in mid March) will be available in both 2.0 and 3.5 versions. The 3.5 version (which we are referring to as Community Manager 2008) will be essentially the same code base, with only a few new 3.5 features used for off-line services (the new System.AddIn stuff).

Our code base will be branched into the two versions for approximately 3 months until we've completed our testing and moved all clients over to .NET 3.5. After that we will return to having the one code base (ie just targeting 3.5).

It is still too early to make firm plans for SQL Server 2008. Whilst we will continue testing with it (especially with CTP6 just released), we won't be moving any production systems (other than our own web site) over to it until much later in the year.

Over to you

There's our situation. I hope it is helpful (or at least of some interest).

I'm keen to answer any questions, hear any criticisms and take on board any suggestions.

Please contact me via any method on my Contact page.


By Craig Bailey