Running development from a RAM disk – options and products

In my post about ditching the solid-state drive in favor of the RAM disk, I mentioned the speed increases. Removing the I/O bottleneck is significant, and it lets us turn our attention to the bottlenecks that remain. Here, I am going to outline what I'm currently using, what I've tried, and some steps to get it all working.

Workstation specs

If your Windows OS is already paging to disk because of a lack of RAM, then using a RAM disk doesn't make much sense. You must have unused RAM. I tried Vista SuperFetch for almost a year, and while it did fill up all available RAM, I still didn't see the machine fly. Allocating a RAM disk from my surplus of RAM did make the machine move much quicker. I've looked at SuperCache and SuperVolume from SuperSpeed, but those products are only offered for Windows XP. They seem very promising because SuperCache does delayed writes on the entire boot drive and uses RAM as the primary I/O resource.

My workstation (and that of all of my employees) is a Dell Precision M4300 laptop (2.x GHz dual-core processor, 8 GB RAM, Seagate 7200 RPM hard drives).

I was able to get 4GB RAM sticks from memory-up.com for less than $160 each.

Products

I tried out SuperSpeed RamDisk Plus as well as DataRam RAMDisk. Both products support the RAM disk size I need (3 GB), and both automatically persist the contents of the RAM disk to the hard drive across shutdown and restart, so the experience is seamless either way. Because I'm using these on laptops, a sudden power outage is unlikely, and all the stored files are working copies from Subversion, where commits happen multiple times per day.

Decision

Ultimately, I'm currently running DataRam RAMDisk, not because I see it as materially superior to SuperSpeed's product, but because it gives me what I need for a lower investment. RamDisk Plus offers much more than I need, so I would be paying for unused features. I used both, and both are easy to use. RamDisk Plus is about $100 for a Vista x64 license, while DataRam's RAMDisk is free for sizes up to 4 GB; you have to pay for larger disks.

RAM disk screenshots

With DataRam RAMDisk, you can format the disk yourself, so that's what I did, formatting it as NTFS with Disk Manager. You can see that I've mapped a 3 GB RAM disk to the R: drive.
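As a sketch, the same NTFS format can also be done from an elevated command prompt instead of Disk Manager, assuming the RAM disk software has already assigned the R: drive letter (the volume label here is just an example):

```shell
:: Quick-format the RAM disk as NTFS with an example label.
:: Run from an elevated command prompt; R: is assumed to be the RAM disk.
format R: /FS:NTFS /Q /V:RAMDisk
```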

[Screenshot: the 3 GB RAM disk formatted as NTFS and mapped to R:]

I have also mounted it inside the C:\ drive so that I can access it quickly from WINDOWS+R:

[Screenshot: the RAM disk mounted as a folder inside the C:\ drive]

I did this through the Disk Manager’s mount points:

[Screenshot: the mount point configured in Disk Manager]
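The same mount point can also be created from the command line with Windows' built-in mountvol utility. This is a sketch; the volume GUID below is a placeholder you would replace with the value mountvol lists for the RAM disk:

```shell
:: List all volumes and their GUID paths to identify the RAM disk's volume name.
mountvol

:: Create an empty folder on C: and mount the RAM disk volume there.
:: The GUID below is a placeholder; use the one mountvol printed for your RAM disk.
mkdir C:\RamDisk
mountvol C:\RamDisk \\?\Volume{00000000-0000-0000-0000-000000000000}\
```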

My final step was to set SQL Server to use the RAM disk as the default location for newly created databases. Our line-of-business applications are I/O-intensive, especially in their database interaction, and the automated tests even more so.

[Screenshot: SQL Server's default database locations pointing at the RAM disk]
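For reference, the same default-location change can be scripted; on SQL Server 2005 the settings live under the instance's registry key. The MSSQL.1 instance key, the default service name, and the R:\ paths below are assumptions; check your own instance before running anything like this:

```shell
:: Point new databases at the RAM disk by setting the instance's default
:: data and log locations. MSSQL.1 is an assumed instance key; R:\SQLData
:: and R:\SQLLog are assumed folders that must already exist on the RAM disk.
reg add "HKLM\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\MSSQLServer" /v DefaultData /t REG_SZ /d "R:\SQLData" /f
reg add "HKLM\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\MSSQLServer" /v DefaultLog /t REG_SZ /d "R:\SQLLog" /f

:: Restart the SQL Server service (default instance name assumed) so the
:: new defaults take effect.
net stop MSSQLSERVER && net start MSSQLSERVER
```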

All in all, the setup is pretty simple regardless of which RAM disk product is used. I love the speed improvement it has given the workstations. Obviously, running less in an automated build will make it faster, but some things just MUST run in a first-line build, and that build must remain fast. Beyond the build, even compilation and working inside Visual Studio are quicker because all of the files are in RAM.


Trackbacks

Interesting Finds: 2009 07.01 ~ 07.05 Posted on 7.04.2009 at 7:46 PM


Weekly Web Nuggets #70 Posted on 7.05.2009 at 10:02 PM


Comments

Stephen Patten said on 6.30.2009 at 10:51 PM

Jeffrey,

Do you also have the source files for the solution on the RAM drive? If not, would you mind explaining how you managed to speed up the compilation?

Thank you,

Stephen

Stephen Patten said on 7.01.2009 at 11:52 AM

OK, I re-read the article; the source files for Visual Studio are also on the mounted RAM drive... On to my own testing.

Thanks again for the great article Jeffrey!

Regards,

Stephen

Michael Tobias said on 7.04.2009 at 6:14 AM

I read this with interest because my experience of Ramdrive Plus is different from yours.

Running on Windows 2003 server you CANNOT "automatically persist the contents of the RAM disk to the hard drive upon shutdown and restart"

According to their support team, if you want to automatically load a RAM drive on server reboot, you have to use their command-line utility mountvol to load an EMPTY RAM drive. You then have to copy your files back over to that RAM drive. You CANNOT automatically reload an existing image on reboot.

So, if you have found a way to do this, I would love to hear from you, because I am ripping my hair out; copying 20 GB+ of data every time the server reboots is simply not an option!

Michael

Jeffrey Palermo said on 7.04.2009 at 2:43 PM

@Michael,

I'm doing this on Vista x64 and Windows 7 x64.

Andy said on 7.16.2009 at 6:04 AM

Please, how do you force Visual Studio to use the RAM drive for temp files during compilation and debugging?

Howard van Rooijen said on 7.16.2009 at 2:52 PM

Looks like I don't need to write part 2 of my "Speeding Up your Desktop Build" post:

blogs.conchango.com/.../speeding-up-you

One change I made when moving over to a RAMDisk was to point Visual Studio's temp directory at the RAM disk, via the web.config of the solution you are building. Add or change the following:

<system.web>
  <compilation tempDirectory="RamDisk:\Temp" />
</system.web>

Mike T (another one) said on 7.21.2009 at 11:41 AM

I was doing this back in the day with Windows 3.1 and Paradox for Windows. (That was when 16 MB of RAM cost $500.) Any time you ran a query, Paradox would save the results in an Answer.db file on disk, which I pointed to my RAM disk. Huge difference in performance, but I guess that is all relative... :)

zvolkov said on 10.14.2009 at 4:09 PM

This worked perfectly for me, cutting my integration tests from over an hour down to 10 minutes. I used symbolic links to map the most I/O-critical folders on my C: drive to my R: drive. Thanks Jeffrey!
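A directory symbolic link like the ones described above can be created on Vista and Windows 7 with the built-in mklink command. This is a sketch; the folder paths below are illustrative, not the actual folders in question:

```shell
:: Move an I/O-heavy folder onto the RAM disk, then link its old path to
:: the new location. Paths are illustrative; run from an elevated prompt.
move C:\BuildOutput R:\BuildOutput
mklink /D C:\BuildOutput R:\BuildOutput
```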

Jeffrey Palermo said on 10.14.2009 at 4:52 PM

@zvolkov,

Glad to help.