Developers driving on ice

Today, all the schools are out, and it is a good old “ice day” in Austin, TX.  For northerners, know that Austin doesn’t really have many plows or equipment to speak of to combat this, because it happens only about every two years. 

If you are from a northern state, or have driven in Colorado to go skiing, you might have experience driving in icy conditions.  No car _really_ does well on ice, but that’s beside the point: the point is to _not_ drive on the ice at all.  And if you have never done it before, you have no idea what to expect.  Without past experience, you don’t know how to prepare for the encounter, what to avoid, how to handle it – or when to avoid it altogether.

Coding on ice

The same is true every day when engineering software.  Because this is such a new profession, we have a short supply of experienced software engineers who have been through the tough challenges before.  Because of the growth in the industry, companies are forced to hire developers who have executed a few projects but lack experience with the wide range of situations that can occur in a software engineering project, or with the many years of production operations in the life of a software system.

Without the prior experience of being in a certain situation before, developers don’t know what to expect, and have to figure out on-the-fly how to handle a new situation.

The point

I don’t pretend to have experienced everything that is possible in the software world.  Few people could, and I continually turn to Fred Brooks (teaching link) for his timeless wisdom in this area.

One particularly hairy situation can be integration with other systems that were built before widespread networks were common.  These systems are very difficult to deal with, and if one has started one’s career with websites and easy-to-use web services, these can catch one by surprise.


This is just a short post to reflect on the similarities drawn from something as simple as driving on ice and creating software in an unfamiliar situation.  Without past experience to draw from, we can get turned around, or find ourselves off the road.

Drawing from others’ experience is a good move: reading the works of others in an area, educating ourselves, and so on.  And the best option is to find someone who has been through the challenge before so that you don’t have to go through the jungle yourself.  There is no shame in asking for help and admitting that you’ve never dealt with a problem quite like this before.  It’s liberating to be able to say “I don’t know” or “I haven’t done anything like this before”.  There is no software engineer in the world who has seen everything.  And the more I learn, the more I discover just how much I have yet to learn.

My current preferred continuous integration build script–psake

I first learned continuous integration and build script principles from Steve Donie back in the last decade.  I’m eternally grateful.  To this day, the basic outline of the build scripts I deploy has the same general flow that he taught me and implemented using NAnt and

Today, we look back at the practice of forcing XML into a procedural programming language and chuckle at how naïve we were as an industry.  Now, we use PowerShell on the Windows platform for modern scripting.  It was a brilliant move for James Kovacs to essentially port the build script concepts to PowerShell with the psake library.

I’ve been speaking on setting up basic software configuration management (SCM) at conferences and user groups for years, and I try to maintain an “Iteration Zero” Visual Studio solution template that includes scripting and the structure necessary for continuous integration right out of the gate.  This build script is the one from that project, and it’s the template I use for every new software system.  It’s been modified a bit over the years.  It came from the one I had used for CodeCampServer back in the day, and it, of course, is used in every project we do at our software engineering practice at Clear Measure.

The full file can be found here.

# required parameters :
# 	$databaseName

Framework "4.0"

properties {
    $projectName = "IterationZero"
    $unitTestAssembly = "UnitTests.dll"
    $integrationTestAssembly = "IntegrationTests.dll"
    $fullSystemTestAssembly = "FullSystemTests.dll"
    $projectConfig = "Release"
    $base_dir = resolve-path .
    $source_dir = "$base_dir\src"
    $nunitPath = "$source_dir\packages\NUnit."
    $build_dir = "$base_dir\build"
    $test_dir = "$build_dir\test"
    $testCopyIgnorePath = "_ReSharper"
    $package_dir = "$build_dir\package"
    $package_file = "$build_dir\latestVersion\" + $projectName + ""
    $databaseName = $projectName
    $databaseServer = "localhost\sqlexpress"
    $databaseScripts = "$source_dir\Core\Database"
    $hibernateConfig = "$source_dir\hibernate.cfg.xml"
    $schemaDatabaseName = $databaseName + "_schema"
    $connection_string = "server=$databaseServer;database=$databaseName;Integrated Security=true;"
    $cassini_app = 'C:\Program Files (x86)\Common Files\Microsoft Shared\DevServer\10.0\WebDev.WebServer40.EXE'
    $port = 1234
    $webapp_dir = "$source_dir\UI"
}
task default -depends Init, CommonAssemblyInfo, Compile, RebuildDatabase, Test, LoadData
task ci -depends Init, CommonAssemblyInfo, Compile, RebuildDatabase, Test, LoadData, Package

task Init {
    delete_file $package_file
    delete_directory $build_dir
    create_directory $test_dir
    create_directory $build_dir
}

task ConnectionString {
    $connection_string = "server=$databaseServer;database=$databaseName;Integrated Security=true;"
    write-host "Using connection string: $connection_string"
    poke-xml $hibernateConfig "//e:property[@name = 'connection.connection_string']" $connection_string @{"e" = "urn:nhibernate-configuration-2.2"}
}

task Compile -depends Init {
    msbuild /t:clean /v:q /nologo /p:Configuration=$projectConfig $source_dir\$projectName.sln
    delete_file $error_dir
    msbuild /t:build /v:q /nologo /p:Configuration=$projectConfig $source_dir\$projectName.sln
}

task Test {
    copy_all_assemblies_for_test $test_dir
    exec {
        & $nunitPath\nunit-console.exe $test_dir\$unitTestAssembly $test_dir\$integrationTestAssembly /nologo /nodots /xml=$build_dir\TestResult.xml
    }
}

task RebuildDatabase -depends ConnectionString {
    exec {
        & $base_dir\aliasql\aliasql.exe Rebuild $databaseServer $databaseName $databaseScripts
    }
}

task LoadData -depends ConnectionString, Compile, RebuildDatabase {
    exec {
        & $nunitPath\nunit-console.exe $test_dir\$integrationTestAssembly /include=DataLoader /nologo /nodots /xml=$build_dir\DataLoadResult.xml
    } "Build failed - data load failure"
}

task CreateCompareSchema -depends SchemaConnectionString {
    exec {
        & $base_dir\aliasql\aliasql.exe Rebuild $databaseServer $schemaDatabaseName $databaseScripts
    }
}

task SchemaConnectionString {
    $connection_string = "server=$databaseServer;database=$schemaDatabaseName;Integrated Security=true;"
    write-host "Using connection string: $connection_string"
    poke-xml $hibernateConfig "//e:property[@name = 'connection.connection_string']" $connection_string @{"e" = "urn:nhibernate-configuration-2.2"}
}

task CommonAssemblyInfo {
    $version = ""
    create-commonAssemblyInfo "$version" $projectName "$source_dir\CommonAssemblyInfo.cs"
}

task Package -depends Compile {
    delete_directory $package_dir
    #web app
    copy_website_files "$webapp_dir" "$package_dir\web"
    copy_files "$databaseScripts" "$package_dir\database"
    zip_directory $package_dir $package_file
}

task FullSystemTests -depends Compile, RebuildDatabase {
    copy_all_assemblies_for_test $test_dir
    & $cassini_app "/port:$port" "/path:$webapp_dir"
    & $nunitPath\nunit-console-x86.exe $test_dir\$fullSystemTestAssembly /framework=net-4.0 /nologo /nodots /xml=$build_dir\FullSystemTestResult.xml
    exec { taskkill /F /IM WebDev.WebServer40.EXE }
}
function global:zip_directory($directory, $file) {
    write-host "Zipping folder: " $directory
    delete_file $file
    cd $directory
    & "$base_dir\7zip\7za.exe" a -mx=9 -r $file
    cd $base_dir
}

function global:copy_website_files($source, $destination) {
    $exclude = @('*.user', '*.dtd', '*.tt', '*.cs', '*.csproj', '*.orig', '*.log')
    copy_files $source $destination $exclude
    delete_directory "$destination\obj"
}

function global:copy_files($source, $destination, $exclude = @()) {
    create_directory $destination
    Get-ChildItem $source -Recurse -Exclude $exclude | Copy-Item -Destination {Join-Path $destination $_.FullName.Substring($source.length)}
}

function global:Copy_and_flatten($source, $filter, $dest) {
    ls $source -filter $filter -r | Where-Object {!$_.FullName.Contains("$testCopyIgnorePath") -and !$_.FullName.Contains("packages")} | cp -dest $dest -force
}

function global:copy_all_assemblies_for_test($destination) {
    create_directory $destination
    Copy_and_flatten $source_dir *.exe $destination
    Copy_and_flatten $source_dir *.dll $destination
    Copy_and_flatten $source_dir *.config $destination
    Copy_and_flatten $source_dir *.xml $destination
    Copy_and_flatten $source_dir *.pdb $destination
    Copy_and_flatten $source_dir *.sql $destination
    Copy_and_flatten $source_dir *.xlsx $destination
}

function global:delete_file($file) {
    if ($file) { remove-item $file -force -ErrorAction SilentlyContinue | out-null }
}

function global:delete_directory($directory_name) {
    rd $directory_name -recurse -force -ErrorAction SilentlyContinue | out-null
}

function global:delete_files_in_dir($dir) {
    get-childitem $dir -recurse | foreach ($_) {remove-item $_.fullname}
}

function global:create_directory($directory_name) {
    mkdir $directory_name -ErrorAction SilentlyContinue | out-null
}

function global:create-commonAssemblyInfo($version, $applicationName, $filename) {
"using System;
using System.Reflection;
using System.Runtime.InteropServices;

// <auto-generated>
//     This code was generated by a tool.
//     Runtime Version:2.0.50727.4927
//     Changes to this file may cause incorrect behavior and will be lost if
//     the code is regenerated.
// </auto-generated>

[assembly: ComVisibleAttribute(false)]
[assembly: AssemblyVersionAttribute(""$version"")]
[assembly: AssemblyFileVersionAttribute(""$version"")]
[assembly: AssemblyCopyrightAttribute(""Copyright 2010"")]
[assembly: AssemblyProductAttribute(""$applicationName"")]
[assembly: AssemblyCompanyAttribute(""Headspring"")]
[assembly: AssemblyConfigurationAttribute(""release"")]
[assembly: AssemblyInformationalVersionAttribute(""$version"")]" | out-file $filename -encoding "ASCII"
}

function script:poke-xml($filePath, $xpath, $value, $namespaces = @{}) {
    [xml] $fileXml = Get-Content $filePath
    if ($namespaces -ne $null -and $namespaces.Count -gt 0) {
        $ns = New-Object Xml.XmlNamespaceManager $fileXml.NameTable
        $namespaces.GetEnumerator() | %{ $ns.AddNamespace($_.Key, $_.Value) }
        $node = $fileXml.SelectSingleNode($xpath, $ns)
    } else {
        $node = $fileXml.SelectSingleNode($xpath)
    }
    Assert ($node -ne $null) "could not find node @ $xpath"
    if ($node.NodeType -eq "Element") {
        $node.InnerText = $value
    } else {
        $node.Value = $value
    }
}

AliaSQL – the new name in automated database change management

Along with this post, please make sure to read Eric Coffman’s very thorough post introducing all of his work on AliaSQL.

Way back in 2006, Kevin Hurwitz and I both worked at a start-up company focused on Sarbanes-Oxley compliance software.  While the business model didn’t quite pan out, we had a killer team, and we created some innovations that have gained widespread adoption even to this day.  Among them are:

While any artifacts from 2006 are long gone, these tools and patterns live on to this day, and many folks around the world have adopted them.  I do have to give credit where credit is due.  In 2007, both Kevin and I were working with Eric Hexter on projects at Callaway Golf Interactive, and Eric materially contributed to a large rewrite of the automated database migrations.  He was also very involved in labeling it “Tarantino”, honoring the famous movie director.  And to this day, Tarantino has been widely adopted as a simple, targeted way to perform automated database migrations in continuous integration environments.

Reviewing the problem

In many teams, source control is normal, but databases are left out in the cold.  Perhaps a massive SQL file is exported from time to time as the DDL definition of the database, but deploying database changes across test, staging and production environments is still an issue and is error-prone.

Several common mistakes exist when managing the database change management process.  The first is developers sharing a development database.  The second is developers maintaining local databases that are synced manually.

When sharing a development database, any change to that database becomes a blocking issue.  Working on branches becomes a problem because when a database change happens, at least one developer ends up working with a version of the code that conflicts with the new database change.  The external environmental change ends up wasting the time of at least one team member.

When each developer maintains isolated databases that are synced manually, the team invariably has to have a meeting in order to figure out what the needed database changes are for a given build that needs to be deployed to production.  Having a repeatable QA process is difficult here.

The problem manifests itself when production deployments happen and some database change is left out or performed differently than intended.  This can result in deployment-time troubleshooting and ad-hoc code or database changes in a heroic effort to salvage the deployment.

The solution

The premise of automated database migrations is to have a process that executes in the exact same fashion in every successive environment so that by the time the production deployment happens, there is no chance of it not working properly.  In addition, the process should not vary based on which feature branch or hot-fix branch in source control is being worked on.  And the process should scale to an unlimited number of pre-production and production environments.
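The core of this forward-only approach can be sketched in a few lines.  This is an illustrative sketch in Python, not AliaSQL's actual implementation; the journal structure and function names are my own.  Scripts are applied in name order, and a journal records what has already run, so every environment replays exactly the same steps.

```python
import os

# Illustrative sketch of forward-only, ordered migrations (not AliaSQL's
# actual code). A "journal" records which scripts have already been applied.

def pending_scripts(script_dir, journal):
    """Return scripts not yet applied, in deterministic (sorted) order."""
    all_scripts = sorted(f for f in os.listdir(script_dir) if f.endswith(".sql"))
    return [s for s in all_scripts if s not in journal]

def apply_migrations(script_dir, journal, execute):
    """Run each pending script via `execute`, then record it in the journal.

    Because the journal travels with each database, dev, QA, staging, and
    production all replay the same scripts in the same order.
    """
    for script in pending_scripts(script_dir, journal):
        execute(os.path.join(script_dir, script))
        journal.add(script)
```

The key property is that the set of scripts to run is derived from what the target database has already seen, never from diffing its schema.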

One school of thought with database migrations uses a process of examining the target database and then generating appropriate scripts based on the state of that database.  In fact, Microsoft’s DACPAC in the database project works like this.  From a philosophical level, I don’t like this approach because it doesn’t allow the QA process to vet the scripts that will actually execute from environment to environment, and there is no opportunity to mix in data transforms in between multi-step table transformations, like the merging or splitting of columns.
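To make the column-merge point concrete, here is a hypothetical three-step migration, sketched with Python and SQLite purely for brevity (the table, column, and script names are invented).  A state-diffing tool would jump straight from the old schema to the new one and lose the data transform in the middle; ordered scripts preserve it.

```python
import sqlite3

# Hypothetical forward-only migration that merges two columns into one.
# Step 2 is a data transform that must run between the two schema steps.
MIGRATION_SCRIPTS = [
    # 001: schema step - add the new column
    "ALTER TABLE person ADD COLUMN full_name TEXT;",
    # 002: data transform - populate it from the old columns
    "UPDATE person SET full_name = first_name || ' ' || last_name;",
    # 003: schema step - the old columns would be dropped here
    # (omitted: DROP COLUMN support varies by engine and version)
]

def run_migrations(conn, scripts):
    """Execute each script in order against an open connection."""
    for script in scripts:
        conn.execute(script)
    conn.commit()
```

A schema diff between the before and after states would only show "column added, columns dropped" and could never reconstruct step 002.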

In addition, I reject migration philosophies that believe rollbacks are possible.  Perhaps a rollback could be performed for completely reversible operations, but as soon as a migration includes a DROP COLUMN operation, the rollback scenario is broken: there is no way to roll back and reverse the deletion of the data in that column.  And once an install package has failed to install properly, how can one trust it to faithfully do the right thing in a rollback attempt?

Introducing AliaSQL, the new simple standard in SQL-based database migrations

Right now, you can download AliaSQL (pronounced “ey-lee-us-Q-L”) from Nuget.  Eric Coffman was a Tarantino user for several years and then started encountering issues because Tarantino hadn’t been maintained in a few years, so he forked the project.  Tarantino was great, and I, along with others, poured many hours into it.  It includes much more than just database migrations, and that’s one of the reasons a new project is warranted – to provide focus.

Tarantino still has a dependency on SQL Server 2008.  SQL Server 2012 isn’t supported, and SQL Azure has some issues.  The SQL SMO dependency was a great idea in its time, but AliaSQL does away with this dependency and achieves broad compatibility as a result.

How to upgrade from Tarantino to AliaSQL

The good news is that AliaSQL is 100% backward compatible with Tarantino database migrations.  This is absolutely intentional.  The process and philosophy of the original Tarantino from 2006 (which was actually NAnt-script-based) are preserved, while a significant rewrite provides more detailed logging, transaction support, and broader database compatibility.

If you have an application and a build script that currently uses Tarantino, I encourage you to make the simple upgrade.  You can check out my Iteration Zero sample project to see how easy it is.  The recommended way to get AliaSQL.exe is from a Nuget search, but you can also directly download just the EXE here.

Then, just update your build script (a psake build script is shown here).

 task RebuildDatabase -depends ConnectionString {
     exec {
-        & $base_dir\tarantino\DatabaseDeployer.exe Rebuild $databaseServer $databaseName $databaseScripts
+        & $base_dir\aliasql\aliasql.exe Rebuild $databaseServer $databaseName $databaseScripts
     }
 }


Notice that for an application already using Tarantino, the only change is the path to aliasql.exe.  All other behavior is exactly the same.

AliaSQL differences from Tarantino

Although backward compatibility is excellent, you will immediately notice some key differences:

  • Immediate compatibility with SQL Server 2012 as well as automatic compatibility with future versions.  This was accomplished by breaking the SQL SMO dependency.
  • Transactions are added:  With Tarantino, when a script failed, the database was in an inconsistent state because transactions were not used.  AliaSQL implements transactions for each SQL file, split by GO lines.  This enables transactional rollback if something goes wrong when executing scripts.
  • New TestData command that executes scripts in a TestData folder, for SQL statements that load test data into non-production environments.
  • New Baseline command that initializes an existing database for future management by AliaSQL.
  • AliaSQL Kickstarter Nuget package that creates a database-specific Visual Studio project to contain SQL scripts and provide a quick console project for the execution of AliaSQL from within Visual Studio
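The transaction bullet above deserves a closer look.  The sketch below illustrates the general technique in Python with SQLite, under my own function names; it is not AliaSQL's actual code.  Each SQL file is split on `GO` lines (the T-SQL batch separator), and all of a file's batches run inside one transaction, so a failed script leaves the database as it was.

```python
import re

def split_batches(sql_text):
    """Split a SQL script on lines containing only GO (T-SQL batch separator)."""
    batches = re.split(r"(?im)^\s*GO\s*$", sql_text)
    return [b.strip() for b in batches if b.strip()]

def run_script_transactionally(connection, sql_text):
    """Execute all batches of one script in a single transaction.

    If any batch fails, the whole file is rolled back, so the database
    never sits in a half-migrated state for that script.
    """
    cur = connection.cursor()
    try:
        for batch in split_batches(sql_text):
            cur.execute(batch)
        connection.commit()
    except Exception:
        connection.rollback()
        raise
```

Real T-SQL adds wrinkles this sketch ignores (some DDL cannot run inside a transaction, and `GO` accepts a repeat count), but the per-file transaction boundary is the idea.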

An upgrade illustrated

Before AliaSQL, still running Tarantino.

[Screenshot: Tarantino build output]

The automated build after upgrading to AliaSQL:

[Screenshot: AliaSQL build output]

Notice the augmented logging, which specifies that a transaction was used.


In closing, go download AliaSQL now and upgrade your old Tarantino applications.  It’s a quick, drop-in upgrade, and you’ll be immediately ready for SQL Server 2012, SQL Azure, and future versions.

And finally, check out the project documentation and get involved on GitHub!

Thanks so much, Eric Coffman, for grabbing the reins and creating this new tool that continues the heritage of this popular approach to automated database change management and database migrations.