Running development from a RAM disk – options and products

In my post about ditching the Solid State Drive in favor of the RAM disk, I mentioned the speed increases.  Taking away the IO bottleneck is significant, and it can let us turn our attention to other bottlenecks that remain.  Here, I am going to outline what I’m currently using, what I’ve tried, and some steps to get it all working.

Workstation specs

If your Windows OS is already paging to disk because of a lack of RAM, then using a RAM disk doesn’t make much sense.  You must have unused RAM.  I tried Vista SuperFetch for almost a year, and while it did fill up all available RAM, I still didn’t see the machine fly.  Allocating a RAM disk from my surplus of RAM did make the machine move much quicker.  I’ve looked at SuperCache and SuperVolume from SuperSpeed, but those products are only offered for Windows XP.  They seem very promising because SuperCache does a delayed write on the entire boot drive and uses RAM as the primary IO resource.

My workstation (and that of all of my employees) is:  Dell Precision M4300 laptop (2.x GHz dual core proc, 8 GB RAM, Seagate 7200 hard drives)

I was able to get 4GB RAM sticks for less than $160 each.


I tried out SuperSpeed RamDisk Plus as well as DataRam RAMDisk.  Both products can support the size of the RAM disk that I need (3GB).  Both products automatically persist the contents of the RAM disk to the hard drive upon shutdown and restart, so there is a seamless experience with both.  Because I’m using these in a laptop scenario, a sudden power outage is unlikely, and all the stored files are working copies of Subversion, where commits happen multiple times per day.


Ultimately, I’m currently running DataRam RAMDisk not because I see it as materially superior to SuperSpeed’s product, but because it gives me what I need for a lower investment.  RamDisk Plus gives me much more than I need; therefore, I would be paying for unneeded features.  I used both, and both are easy to use.  RamDisk Plus is about $100 for a Vista x64 license, and DataRam’s RAMDisk is free for the size I need (<= 4GB); you have to pay for larger sizes.

Ramdisk Screenshot

With DataRam RAMDisk, you can format the disk yourself, so I formatted it as NTFS with Disk Manager.  You can see that I’ve mapped a 3GB RAM disk to the R: drive.


I have also mounted it inside the C: drive so that I can access it quickly from WINDOWS+R:


I did this through the Disk Manager’s mount points:


My final step was to set SQL Server to use the RAM disk as its default location for newly-created databases.  Our line-of-business applications are IO-intensive, particularly in their database interaction, and the automated tests are especially so.
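One way to do that (a sketch only, not taken from the post; the R:\Databases folder is my assumption, and a service restart is needed before new databases pick it up) is to point the instance's default data and log directories at the RAM disk:

```sql
-- Sketch: point the instance's default data/log locations at the RAM disk.
-- xp_instance_regwrite resolves the instance-specific registry path.
EXEC master.dbo.xp_instance_regwrite
     N'HKEY_LOCAL_MACHINE',
     N'Software\Microsoft\MSSQLServer\MSSQLServer',
     N'DefaultData', REG_SZ, N'R:\Databases';
EXEC master.dbo.xp_instance_regwrite
     N'HKEY_LOCAL_MACHINE',
     N'Software\Microsoft\MSSQLServer\MSSQLServer',
     N'DefaultLog', REG_SZ, N'R:\Databases';
```

The same setting is reachable in Management Studio under Server Properties, Database Settings, "Database default locations".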


All in all, the setup is pretty simple regardless of which RAM disk product is used.  I love the speed improvement it has given the workstations.  Obviously, running less stuff in an automated build will make it run faster, but there are some things that just MUST be run in a first-line build, and that build must remain fast.  Besides the build, even compilation and working inside Visual Studio are quicker because all of the files are in RAM.

The ASP.NET MVC ActionController – The controllerless action, or actionless controller

There has been quite a bit of discussion about how controllers are really namespaces trying to get out once you use the concept in a nontrivial application.

Brian Donahue’s post on The anti-controller revolution prompted me to do this little experiment.  He references some Twitter posts by Jimmy Bogard, one of my esteemed consultants at Headspring Systems.  Chad Myers has also opined about the notion of more independent actions and has cited precedent.

My interest in this space is purely practical.  I really don’t care how patterns are published.  I don’t care about “being true” to the MVC pattern or any other pattern.  I’m more interested in being effective with web applications on .NET.  After having experience with MvcContrib, CodeCampServer, and a much larger ASP.NET MVC implementation (200+ screens), I have come to see how controllers end up searching for an identity.  What is a ProductController anyway?  That’s just about as specific as classes called ProductService, ProductManager, ProductUtility, etc.


You can do an SVN checkout of the following URL to see my spike code:  You can get it in zip file format here.  (I repeat these links below.)

In the default ASP.NET MVC project template, there is a HomeController, and then there is an AccountController that hooks up the ASP.NET MembershipProvider.  The AccountController handles registration, logging in and out, and changing passwords, and it is WAY too big.  The AccountController lacks cohesion.  The AccountController has more than one reason to change.  Each of the actions seems more cohesive on its own.

I’m going to narrate a before and an after of the ASP.NET MVC default project as I refactored it into an ActionController-based application.  The controller names were promoted to namespaces, and the action names were promoted to controller names.  The requests for a GET and POST of the same URL are handled by the same ActionController since the action name is the same.  There are two methods in the class: one that handles GET and one that handles POST.  Within the ActionController, the methods are named “Execute” since the name of the action is in the class name.  The view structure stayed the same; there doesn’t seem to be much pain there.


Let’s look at the possible URLs in the default project:

  • / (GET)
  • /Home/About (GET)
  • /Account/LogOn (GET)
  • /Account/LogOn (POST)
  • /Account/Register (GET)
  • /Account/Register (POST)
  • /Account/ChangePassword (GET)
  • /Account/ChangePassword (POST)
  • /Account/ChangePasswordSuccess (GET)

Throughout my refactoring, the URLs do not change.  The routes do not change.  The only thing that changes is that the controllers are broken up into multiple classes along action lines.  For instance, there are two LogOn actions: one for the form rendering, and one to accept the post.  These two are cohesive together, but they are not cohesive when combined with Register, as they are by default in the AccountController.


Let’s start at the beginning.  Pictured here (MvcApplication2) is the default project with no modification.  You can check out the code yourself.  The HomeController is pretty easy to dissect, but the AccountController is responsible for 7 independent requests.  5 too many, I think.


Pictured here (MvcApplication1) is what the project ended up looking like after the refactoring.

You can see that the actions from the AccountController were promoted to be controllers. 

Let’s take a look at the LogOnController.  I have pushed the two Execute methods to the top for clarity.  With ActionControllers, the controller is only concerned about one action.  In this case, the GET pass of the action renders a form, and the POST pass of the action modifies some server state.  Here is the code:

Sample ActionController

namespace MvcApplication1.Controllers.Account
{
    public class LogOnController : ActionController
    {
        public ActionResult Execute()
        {
            return View();
        }

        [System.Diagnostics.CodeAnalysis.SuppressMessage("Microsoft.Design", "CA1054:UriParametersShouldNotBeStrings",
            Justification = "Needs to take same parameter type as Controller.Redirect()")]
        public ActionResult Execute(string userName, string password, bool rememberMe, string returnUrl)
        {
            if (!ValidateLogOn(userName, password))
            {
                return View();
            }

            FormsAuth.SignIn(userName, rememberMe);
            if (!String.IsNullOrEmpty(returnUrl))
            {
                return Redirect(returnUrl);
            }
            return RedirectToAction("", "Home");
        }

        public LogOnController()
            : this(null, null)
        {
        }

        // This constructor is not used by the MVC framework but is instead provided for ease
        // of unit testing this type. See the comments at the end of this file for more
        // information.
        public LogOnController(IFormsAuthentication formsAuth, IMembershipService service)
        {
            FormsAuth = formsAuth ?? new FormsAuthenticationService();
            MembershipService = service ?? new AccountMembershipService();
        }

        public IFormsAuthentication FormsAuth
        {
            get;
            private set;
        }

        public IMembershipService MembershipService
        {
            get;
            private set;
        }

        private bool ValidateLogOn(string userName, string password)
        {
            if (String.IsNullOrEmpty(userName))
            {
                ModelState.AddModelError("username", "You must specify a username.");
            }
            if (String.IsNullOrEmpty(password))
            {
                ModelState.AddModelError("password", "You must specify a password.");
            }
            if (!MembershipService.ValidateUser(userName, password))
            {
                ModelState.AddModelError("_FORM", "The username or password provided is incorrect.");
            }

            return ModelState.IsValid;
        }
    }
}

What’s at the heart of this, you might ask?  There are two things:

  1. A Custom controller factory (to find the right ActionController)
  2. A controller base class, “ActionController”
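As a rough, self-contained sketch of the lookup the custom factory has to do (all names below are hypothetical stand-ins, not the actual spike code): the controller token from the route becomes a namespace segment, and the action name becomes the class name.

```csharp
using System;
using System.Linq;

// Hypothetical stand-ins for the spike's controller namespaces/types.
namespace Demo.Controllers.Account
{
    public class LogOnController { }
    public class RegisterController { }
}

namespace Demo
{
    public static class ActionControllerResolver
    {
        // Maps a (controller-as-namespace, action-as-class) pair to a type,
        // e.g. ("Account", "LogOn") -> Demo.Controllers.Account.LogOnController.
        public static Type Resolve(string controllerNamespace, string actionName)
        {
            string suffix = string.Format(".Controllers.{0}.{1}Controller",
                controllerNamespace, actionName);
            return typeof(ActionControllerResolver).Assembly
                .GetTypes()
                .SingleOrDefault(t => t.FullName.EndsWith(suffix, StringComparison.OrdinalIgnoreCase));
        }
    }

    public class Program
    {
        public static void Main()
        {
            // Resolve the type that handles /Account/LogOn.
            Type type = ActionControllerResolver.Resolve("Account", "LogOn");
            Console.WriteLine(type.FullName); // Demo.Controllers.Account.LogOnController
        }
    }
}
```

A real factory would additionally instantiate the type and pick the GET or POST Execute overload; this only shows the name-to-type convention.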


I spiked out an implementation of the ActionController.  It is completely unvetted in a real environment, but the sample project is available for download here.  The download includes an ASP.NET MVC project with the ActionController class.  You can do an SVN checkout here:

Please download the code and check it out.  What it ends up doing for the actions is group them cohesively; the concept of the “controller” then becomes a namespace.  The controller factory needs work to be able to locate ActionControllers that are unique within the controller namespace but not unique throughout the project.  This is a rough first pass that I did in 30 minutes.

I’m not sure if this is what I’ll commit to MvcContrib for more widespread consumption, but my teams and I are feeling a bit of pain with bloated controllers, so it’s worth considering.  What I like most about this approach is that the only thing that changed was the controllers.  The routes don’t change.  The view folder structure doesn’t change.  The Html helpers don’t change.  We merely refer to the concept of a controller as a namespace rather than a class.  We now refer to an action as a class instead of a method.

Death of the professional speaker? Will never happen

I was listening to Ted Neward on Hanselminutes from NDC, and he mentioned that local conferences are taking attendees away from professional conferences like VSLive and DevConnections.  His hypothesis is that if these professional conferences go away, then capable speakers will not have the incentive to go around teaching.

Scott Bellware aptly countered that the South by Southwest conference has thousands of enthusiastic attendees and is getting stronger every year.  His notion is that the conference must be compelling.

Whether you call it Capitalism or Darwinism, I agree with Scott.  It is up to the conference to offer a compelling experience.  The conference is selling a product, and the product has to have a value proposition.  If struggling conferences can’t provide a value proposition, then they will simply lose market share to other conferences that do.

I don’t think death will ever come to the professional speaker.  The conference venues may change, but they will change as a reaction to changes in the market.  I became a professional speaker in 2007, and I am confident that I will always find compelling conference venues in which to participate.

Manning offers Alt.Net book series and 42% discount on them all

Manning just sent out “alt42” as a discount code for the Alt.Net books until June 25th.  Along with my book, ASP.NET MVC in Action, you can use this discount code on others such as:

Upgrade TortoiseSVN, switch shortcut key for OK

Since this caught me a little by surprise, it probably will catch someone else.  Here’s a tip: the shortcut key for the “OK” button is now CTRL+ENTER.



When on the commit dialog, I often use only the keyboard.  I try to stay away from the mouse as much as possible.  I type the commit message, toggle ALT+A to select all files that were changed (to get non-versioned files as well), and then I was used to hitting ALT+O because the “O” in “OK” was a hot letter.

I upgraded to TortoiseSVN 1.6.2, and ALT+O didn’t work anymore.  It may have been in the release notes, who knows, but I found out, quite by accident, that CTRL+ENTER keeps my fingers on the keyboard.  I’m sure one of you commenting will let me know that this is some old, universally-known truth, but it’s news to me.  In case it helps someone else, here it is . . . and here is the version of TortoiseSVN I’m on now.


Bytes by MSDN mini vidcasts available

MSDN has started a mini video series called “Bytes by MSDN” and has the first video online.  They are pretty short and are perfect for playing on your smartphone, iPhone, video iPod, or . . . uh . . . Zune.

Scott Hanselman has the first interview.  If you subscribe to the feed, you’ll see my video come through on August 27, 2009.

Here is the schedule:


Bytes by MSDN Schedule

June 11	Scott Hanselman
June 18	Billy Hollis
June 25	Kate Gregory
July 2	Richard Campbell
July 9	Stephen Forte & Clemens Vasters
July 16	Tim Huckaby & Michele Leroux Bustamante
July 23	Jim Wilt & Brian Noyes
July 30	Loke Uei Tan
Aug 6	Matt Hessinger
Aug 13	Don Box
Aug 20	Juval Lowy
Aug 27	Jeffrey Palermo
Sept 3	Tim Heuer & Out Takes

Speeding up the build – ditch the SSD and go for the RAM drive

Over the past few months, I’ve been searching for the best and most cost-effective way to speed up the build regardless of which projects my guys are working on.  We have a range of projects that see build times from 40 seconds on up to 10 minutes.  At the upper end of the scale, waiting 10 minutes for a build of the software starts to become painful.  I understand that at a larger scale, build systems can become very complex, and we will have to take that leap to the next level of complexity at some point.  I am trying to delay absorbing that complexity as long as possible.  In order to keep it working as a single build, I need to find a way to make the build run faster.

Here are some of the things that run as a part of the Continuous Integration build (it runs on each developer workstation as well as the CI server):

  • Create a test database
  • Poke XML configuration files with build settings
  • Compile with MSBuild
  • Run all unit tests with NUnit
  • Run all fast integration tests with NUnit
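The build script itself isn't reproduced in this post.  As a rough sketch only (target names, paths, and task usage here are my assumptions, not Headspring's actual NAnt template), a CI target chaining the steps above might look like:

```xml
<!-- Illustrative NAnt fragment; names and paths are hypothetical. -->
<target name="ci" depends="create-database, poke-config, compile, unit-test, integration-test" />

<target name="compile">
  <exec program="msbuild.exe">
    <arg value="src\CodeCampServer.sln" />
    <arg value="/p:Configuration=Release" />
  </exec>
</target>

<target name="unit-test" depends="compile">
  <exec program="tools\nunit\nunit-console.exe">
    <arg value="bin\CodeCampServer.UnitTests.dll" />
  </exec>
</target>
```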

For the purposes of this post, I’ll use CodeCampServer as a demonstration vehicle.  CodeCampServer uses some of the concepts of Headspring’s architecture, and the build uses our standard NAnt template.

Running the build “Click To Build.bat” from the root of the trunk working copy, we see that it takes 43.9 seconds to run.


I tried a Solid State Drive.  In fact, I tried two.  I tried the Patriot Memory WARP 128GB SSD (SATA) as well as the Intel X-25 80GB SSD (SATA).  The Patriot, in my opinion, is a piece of junk!  It benchmarks fast, but under real usage, where the build system (and Visual Studio) are hammering it all the time with small reads and writes, it just chokes.  It slows Windows to a near crawl.  I don’t have any numbers to share with you, but I’m pretty disgusted with that drive.  I had replaced the Seagate 7200RPM 120GB drive that came with my Dell Precision M4300 laptop with it.

Next, I tried the Intel X-25 80GB SSD.  This drive is pretty good, but I don’t see a big difference between it and the 7200RPM Seagate.  Power consumption is less, and the heat is less, but I did not feel a big performance boost.  Again, no numbers because I’m not driving toward incremental improvement.  I need a revolutionary improvement in order to feel comfortable doing a hardware upgrade across the board for all of my employees.  I want to upgrade the hardware, but I want to be sure there is a firm ROI story for it. 

I have gone back to the Seagate 7200RPM drive, but I did the following.  I upgraded the laptop to 8GB of RAM, and I installed SuperSpeed RamDisk Plus.  I configured a 3GB RAM disk and moved my TortoiseSVN working copies to the RAM disk.  I also configured SQL Server to use the RAM disk as the default location for databases.



This is important because during one of our software builds, the fast integration tests consume the majority of the build time, and much of that is testing queries and data access.  The log file and the data file will now reside in RAM inside the RAM disk.  The .NET Framework, MSBuild, and other stuff still reside on the spindle drive, but much of the disk access has now been moved to RAM, which registers over 2000 MB/s on the read side with a 0ms access time.


The SuperSpeed RamDisk Plus product runs about $100 for Vista x64, and it has a nice feature that copies the contents of the RAM disk to the main boot drive while shutting down.  It will reload on boot.  This is perfect for version-controlled source code because the risk of loss is low, thanks to frequent integration with the version control system.

Let’s cut to the chase.  What do these changes net for the CodeCampServer build shown earlier?


That’s a 23% savings off the build time.  As a build approaches 10 minutes, it becomes critical to reduce the build time.  A slow build will not be run frequently.  A build that’s not run frequently won’t have the opportunity to alert the team to problems right away.  In an extreme programming project, the build is essential to keeping the team going at a fast pace.  I have used CodeCampServer as an example, but with longer builds, the testing step becomes the majority of the time.  Integration tests with the database and file system start to take over 75% of the overall build time, and if we can reduce the time it takes for disk access, then database performance increases as well.  On a larger project that I can’t share, the build time went from 10 minutes down to 6.  That’s a 40% savings.

We’ve also tried running data access tests against an in-memory SQLite database, but we can’t go with that solution for all projects, and I want an approach that is portable.  If I can find a way to run SQL Server in a memory-only mode, that might help even more.  For now, it seems like beefed-up RAM and a RAM disk give some pretty substantial returns.

I really thought that solid-state drives would be the ticket to faster builds.  With a system that was largely IO-bound, I was counting on SSDs to come to the rescue.  I think they eventually will get better, but it is still too early.  Spindle drives are very mature; the drivers, firmware, and drive cache all make them pretty snappy.  The two SSDs I tried did not blow me away.  For now, I’m going to bump up RAM and investigate competing RAM drive products.

By the way, if any of you readers have good experience with RAM drive products, please leave a comment.

Monthly Afternoon Workshops – topics, please (free Headspring events)

This post is an announcement of the free afternoon workshops we are putting on at Headspring Systems.  It is an example of how we find it valuable to give back to the local community here in Austin, TX.

My plans for the free afternoon workshops are to hold one per month.  Philip Wheat, from Microsoft, has been extremely supportive.  He is helping us book the classroom at the Microsoft office so that these can continue on a regular basis.  These events are completely free, and all 4 hours will be used for teaching the topic.  No sales pitches, just learning.

We have scheduled two of these workshops on the topic of ASP.NET MVC.  What I need to find out from you folks, who live in or are willing to travel to Austin, is what topics should be on the calendar.  For instance, ASP.NET MVC is hot, and we have specific expertise on the topic.  As the CTO of Headspring, I am tapping into many of the consultants to provide the topics that are most relevant. 

Some of the topics that have been requested are:

  • NHibernate
  • Inversion of Control (IoC)
  • Test-Driven Development
  • Domain-Driven Design
  • Automated Builds

I need some help from you.  Are any of these topics interesting?  Are there other topics that would be good workshop topics?  Should the workshops be confined to programming topics or should we move into software management topics?  Please leave a comment telling me what you think.  This is one avenue where we are giving back to the community, and I want to ensure it has the biggest impact possible.