Shortly before Windows 8 tablets became available, I wrote about my selection process, focusing on the key decisions that helped narrow my choices: largely a consideration of Windows RT on ARM vs. Windows on Atom vs. Windows on i-Series processors. My first few weeks with the device were a mixed bag, and I’ve now returned the tablet, replacing it with a Netbook. I can’t say I saw any of this coming, so I thought it would be worth writing about the issues I faced between deciding on a Samsung ATIV Smart PC Pro and finally returning it eight weeks later. I’ll also revisit my criteria with some hands-on experience under my belt, consider how Ultrabooks/Netbooks with touch compare to ARM/Atom tablets on price, functionality and components, and look at how Windows 8 itself disrupts hardware refresh patterns. Although this post roams a bit, I hope it’s joined up by a common thread: the unexpected and disruptive effects of Windows 8.
Late last year my colleagues and I tried to distil the tasks that impede SharePoint developer productivity. I then ran those tests on EC2, Hyper-V and VMware Workstation, with the latter two virtualisation technologies running on a desktop, an older laptop and a newer laptop. In this post I hope to shed some light on the follow-up testing that I’ve squeezed into the odd hour here and there over the last six months. Unfortunately, hardware availability and my schedule have not aligned to produce a further round of comprehensive tests, and since I can’t see that happening in the immediate future, I’m going to fill in some gaps here with a couple of additional concrete findings, particularly on i5 vs. i7 performance and the impact of an SSD on first page load times after application pool recycles. I’ll also discuss a few related issues less rigorously.
Drum roll please! At long last, I bring you the results of a great deal of testing. Here’s the background:
- SharePoint Development Productivity and Virtualisation Technologies
- SharePoint 2010 Development Environment Performance Tests
I’ve said my preamble in those posts, so I’ll cut to the chase here.
As I indicated in my last post, I’ve been plumbing the depths of SharePoint development productivity in recent months. The context established in that post is essential to what follows here. In a nutshell, I’m trying to improve system performance for current users of our SharePoint development environment. This is not as simple as comparing the Windows Experience Index across a number of laptop models: I needed to consult our users to identify which tasks are slow for them, and then devise tests that would let me measure system performance on different physical and virtual systems. In this post I will describe the systems, the tests and the testing process before reviewing the results.
The 21 tests we settled on came out of discussions with a number of the core developers, consultants and architects at Content and Code, plus a few tests I threw in to confirm or disconfirm some of my suppositions, such as the impact of the User Profile Service Connection on first page load time. All 21 tests were run three times for each permutation of hardware candidate and virtualisation technology, and we also tested on Amazon EC2. I will discuss the testing process in more detail in a moment.
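To illustrate the shape of this kind of test harness, here is a minimal sketch in Python. It is not the tooling used in the actual tests (the post does not specify one); the function names and the placeholder URL are my own, and it simply mirrors the stated approach of timing each task and running it three times per permutation:

```python
import time
import urllib.request


def mean_seconds(timings):
    """Average a list of timings, in seconds."""
    return sum(timings) / len(timings)


def time_page_load(url, runs=3):
    """Fetch `url` `runs` times and return the mean load time in seconds.

    A first-page-load test (e.g. after an application pool recycle) reduces
    to timing an HTTP GET; `runs=3` mirrors the three runs per permutation
    described in the post.
    """
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            response.read()  # drain the body so the full load is timed
        timings.append(time.perf_counter() - start)
    return mean_seconds(timings)


# Hypothetical usage against a dev VM (URL is a placeholder):
# result = time_page_load("http://sharepoint-dev-vm/", runs=3)
```

A real harness would also need to trigger the application pool recycle before the first timed request, since it is the post-recycle warm-up cost that the SSD comparison is measuring.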