.NET DevOps for Azure

I’ve been working hard to bring together all that I have learned over the past years into my new book: .NET DevOps for Azure.

It is a culmination of a long-time vision, some key leadership, and a confluence of industry events.

Almost fifteen years ago, I gained a passion for helping .NET DevOps engineers and DevOps services companies succeed, for making the complex simple, and for finding rules of thumb that work for 80% of situations. With too many options in the software world and too many answers of “it depends”, the industry has been starved for the ability to do something “by the book.”

This book presents a scenario where a .NET developer can say, “I’m doing Azure and .NET DevOps by the book.” In this manner, one would know what models and patterns are in play and what to expect from that environment.

The examples largely use the Visual Studio 2019 preview edition. However, the code and the Azure DevOps Services pipeline work with .NET Core 2.2 and can be used as the basis for your own applications.

The example configuration used throughout this book can be leveraged through a public project and source code repository online.

Visit Amazon to order.

Or email me jeffrey@clear-measure.com to get a free eCopy of Chapter 3: The Professional-Grade DevOps Environment!

.NET DevOps Bootcamp

Architect + Lead Engineer Hands-On 2-Day Immersion hosted by ME! Jeffrey Palermo!

Is simplifying your software development and processes something that you’d like to see happen in your organization?
If so, this class was made with you and your team in mind!   

Join me!
  .NET DevOps Bootcamp: Architect & Lead Engineer Hands-On Immersion 
Hosted by Jeffrey Palermo
January 16th-17th
Austin, TX

I’m excited to be able to offer this 2-day training. I’ll walk you through seven key steps to simplify your .NET DevOps world, streamlining the development and deployment process and making it applicable to your everyday work.

Attendees will learn the concepts, apply them, and implement the latest DevOps tools for Microsoft-based applications, including Azure DevOps Services, Git, Azure Pipelines, Azure PaaS environments, and Octopus Deploy.

You’ll also get:

My current favorite private build script

I pull in the Exec function from psake because it is coded very well. The build script itself is plain PowerShell and is geared toward .NET Core.

. .\BuildFunctions.ps1

$projectName = "OnionDevOpsArchitecture"
$base_dir = resolve-path .\
$source_dir = "$base_dir\src"
$unitTestProjectPath = "$source_dir\UnitTests"
$integrationTestProjectPath = "$source_dir\IntegrationTests"
$uiProjectPath = "$source_dir\UI"
$databaseProjectPath = "$source_dir\Database"
$projectConfig = $env:BuildConfiguration
$version = $env:Version
$verbosity = "q"

$build_dir = "$base_dir\build"
$test_dir = "$build_dir\test"

$aliaSql = "$source_dir\Database\scripts\AliaSql.exe"
$databaseAction = $env:DatabaseAction
if ([string]::IsNullOrEmpty($databaseAction)) { $databaseAction = "Rebuild" }
$databaseName = $env:DatabaseName
if ([string]::IsNullOrEmpty($databaseName)) { $databaseName = $projectName }
$databaseServer = $env:DatabaseServer
if ([string]::IsNullOrEmpty($databaseServer)) { $databaseServer = "localhost\SQL2017" }
$databaseScripts = "$source_dir\Database\scripts"

if ([string]::IsNullOrEmpty($version)) { $version = "9.9.9" }
if ([string]::IsNullOrEmpty($projectConfig)) { $projectConfig = "Release" }

Function Init {
    # Start from a clean build directory
    rd $build_dir -recurse -force -ErrorAction Ignore
    md $build_dir > $null

    exec {
        & dotnet clean $source_dir\$projectName.sln -nologo -v $verbosity
    }
    exec {
        & dotnet restore $source_dir\$projectName.sln -nologo --interactive -v $verbosity
    }

    #Write-Host $projectConfig
    #Write-Host $version
}

Function Compile {
    exec {
        & dotnet build $source_dir\$projectName.sln -nologo --no-restore -v $verbosity -maxcpucount --configuration $projectConfig --no-incremental /p:Version=$version /p:Authors="Clear Measure" /p:Product="Onion DevOps Architecture"
    }
}

Function UnitTests {
    Push-Location -Path $unitTestProjectPath
    try {
        exec {
            & dotnet test -nologo -v $verbosity --logger:trx --results-directory $test_dir --no-build --no-restore --configuration $projectConfig
        }
    }
    finally {
        Pop-Location
    }
}

Function IntegrationTest {
    Push-Location -Path $integrationTestProjectPath
    try {
        exec {
            & dotnet test -nologo -v $verbosity --logger:trx --results-directory $test_dir --no-build --no-restore --configuration $projectConfig
        }
    }
    finally {
        Pop-Location
    }
}

Function MigrateDatabaseLocal {
    exec {
        & $aliaSql $databaseAction $databaseServer $databaseName $databaseScripts
    }
}

Function MigrateDatabaseRemote {
    $appConfig = "$integrationTestProjectPath\app.config"
    $injectedConnectionString = "Server=tcp:$databaseServer,1433;Initial Catalog=$databaseName;Persist Security Info=False;User ID=$env:DatabaseUser;Password=$env:DatabasePassword;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"

    write-host "Using connection string: $injectedConnectionString"
    if (Test-Path $appConfig) {
        poke-xml $appConfig "//add[@key='ConnectionString']/@value" $injectedConnectionString
    }

    exec {
        & $aliaSql $databaseAction $databaseServer $databaseName $databaseScripts $env:DatabaseUser $env:DatabasePassword
    }
}

Function Pack {
    Write-Output "Packaging nuget packages"
    exec {
        & .\tools\octopack\Octo.exe pack --id "$projectName.UI" --version $version --basePath $uiProjectPath --outFolder $build_dir
    }
    exec {
        & .\tools\octopack\Octo.exe pack --id "$projectName.Database" --version $version --basePath $databaseProjectPath --outFolder $build_dir
    }
    exec {
        & .\tools\octopack\Octo.exe pack --id "$projectName.IntegrationTests" --version $version --basePath $integrationTestProjectPath --outFolder $build_dir
    }
}

Function PrivateBuild {
    Init
    Compile
    UnitTests
    MigrateDatabaseLocal
    IntegrationTest
}

Function CIBuild {
    Init
    MigrateDatabaseRemote
    Compile
    UnitTests
    IntegrationTest
    Pack
}
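On the build server, the CIBuild entry point can be invoked from an Azure Pipelines step. This is a minimal sketch, assuming the script is saved as build.ps1 next to BuildFunctions.ps1 (the file name, version scheme, and variable names holding the database secrets are my assumptions, not part of the script above):

```yaml
# azure-pipelines.yml (sketch): dot-source the script, then run the CI entry point
trigger:
  - master

pool:
  vmImage: 'windows-2019'

steps:
  - powershell: |
      . .\build.ps1
      CIBuild
    displayName: 'CI Build'
    env:
      # These map onto the $env: variables the script reads
      BuildConfiguration: Release
      Version: 1.0.$(Build.BuildId)
      DatabaseServer: $(DatabaseServer)
      DatabaseUser: $(DatabaseUser)
      DatabasePassword: $(DatabasePassword)
```

Because the same functions back PrivateBuild and CIBuild, a developer can run the identical steps locally by dot-sourcing the script and calling PrivateBuild.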

 

 

Palermo Pamphlet launch – episode 001


My goal is to teach, inform, and have a little fun. But I want this to provide value for programmers shipping custom software using Microsoft tools. Here is the first episode of the Palermo Pamphlet, which I hope will be a valuable resource to you.

Here are the show notes:

Performance tuning an Azure DevOps build configuration

We’ve all seen the culprits that constantly add time to builds. You might observe that your NPM install or NuGet restore takes several minutes. I remember back to the days of CC.Net in 2005, when a small application build could complete in 45 seconds, including unit tests, and 10 minutes was the “thou shalt not go over” threshold. So we cannot allow NPM or any other step to take minutes. We have to ferret that out.
The answer is the same as in code performance profiling. Find out where every build is spending time doing work that adds no value or that doesn’t vary often, then cache the result. For so many builds, these are the culprits that take time but typically aren’t part of the changes being tested from build to build:
  1. Obtaining a build server (when choosing hosted build agents)
  2. Cloning the source
  3. Package restores
  4. Copying/archiving build artifacts
Here are my common solutions for reducing these common culprits (I’d be interested to know how others have eliminated these time sinks):
  1. Use our own Azure VMs as the build agents (running multiple agents on a single VM) – always available at a moment’s notice
  2. Let Azure Pipelines be a little less aggressive about cleaning source, and instead have the build script delete the build directories at the beginning – this removes the need for a full clone and allows just a pull (it works most of the time and requires perhaps a monthly purge for a clean clone, but saves SO much time)
  3. Either a) retain the cloned working tree so that the previous package restore is reused for subsequent builds, or b) check in packages so that package restores are not necessary for every build
  4. Once builds are working and reliable, only archive the build artifacts that are directly used by the release pipeline (typically the NuGet packages that house the application components)
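Items 2 and 3 above can also be expressed in the pipeline definition itself. This is a minimal sketch, assuming the Azure Pipelines YAML schema and the Cache@2 pipeline-caching task are available to you; the cache key and the NUGET_PACKAGES redirection follow the common pattern for NuGet and are assumptions, not a prescription:

```yaml
variables:
  # Redirect the NuGet cache to a path inside the pipeline workspace so it can be cached
  NUGET_PACKAGES: $(Pipeline.Workspace)/.nuget/packages

steps:
  # Item 2: skip the aggressive source clean; let the build script delete its own output directories
  - checkout: self
    clean: false

  # Item 3: restore the NuGet package cache from a key derived from the project files
  - task: Cache@2
    inputs:
      key: 'nuget | "$(Agent.OS)" | **/*.csproj'
      path: $(NUGET_PACKAGES)
    displayName: Cache NuGet packages
```

When the project files have not changed, the cache hit replaces the multi-minute restore with a copy operation.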

Why I started the Azure DevOps Podcast

I wanted to share a little bit of my reasoning for starting the Azure DevOps Podcast.  The above video is about 4 minutes long.  Please take a look.  Feel free to play it at double speed.  The gist of it is that I like seeing developers having fun.  Busted releases are not fun.  All-weekend deployments are not fun.  New things breaking when you make a code change are not fun.  And software development can be SO MUCH FUN!  I’ve spent 21 years so far as a professional programmer, not counting the toy apps and websites before that. I still love it, and I would like to share that joy with anyone I can.

Here are some links I mention in the video:

 

Applying 4+1 Architecture Blueprints to Continuous Delivery

You may just be learning about the iterative, emergent architecture method known as “4+1”.  You can read the original paper here.  Written by Philippe Kruchten in 1995, this 15-page paper lays out the views that are needed in order to communicate the architectural elements of a software system. Coursera has a good summary video in their course catalog here.

Architects in other professions go through similar thought processes as software architects, so it can be useful to borrow the graphical outputs that these other architects generate as illustrations of the decisions made while finding a solution suitable to the problem.

Continuous Delivery is an umbrella term used to describe an automated process that takes changes to the source and configuration of a system and flows those changes through a pipeline, ultimately to a running production system, while catching quality issues along the way. Jez Humble maintains a very useful site dedicated to continuous delivery here.

I recently presented a sample 4+1 architecture to the Cloud Austin user group. Since I received several requests for the diagram, I’m posting it here. If you have any additional questions, contact me.  I’m always happy to help.  Additionally, I have a high-resolution PDF of this diagram (ARCH D in size, if you would like to print it on a plotter).

[Diagram: DevOps 4+1 Architecture Blueprints]

How to incrementally adopt VSTS for DevOps automation

VSTS has fantastic DevOps-enabling features, but a common issue when planning to adopt it is how to handle the large migration of work items and project plans from other tools.  The good news is that you don’t have to change anything about where you track your work or projects in order to make use of builds, packaging, or automated deployments.  If you track your work in Jira and your code in GitHub, you’ll want to configure VSTS like the following:

[Screenshot: VSTS project configuration]

If your VSTS does not look like mine, you’ll want to turn on the new navigation features by selecting your profile picture in the top right corner.

[Screenshot: enabling the new navigation]

With this configuration, you’ll be able to:

  • Configure continuous integration builds
  • Run unit tests & component-level integration tests
  • Package versioned release candidates
  • Store release candidate packages in the built-in NuGet server
  • Create releases for a particular build
  • Map the progression of pre-production environments as well as production
  • Deploy release candidates
  • Run full-system tests against deployed instances of your builds
  • Trend statistics about your tests
  • Integrate static code analysis into your devops pipeline
  • Configure and run load tests

Many teams already have project-tracking systems and source code repositories in place.  When bringing DevOps methods online, it may be appropriate to focus the VSTS project on just the capabilities needed at the moment.
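As a concrete starting point, a CI build for code that stays in GitHub can be defined with a YAML file committed to that repository, while Jira remains the system of record for work items. This is a sketch; the solution name and step layout are hypothetical:

```yaml
# azure-pipelines.yml, committed to the GitHub repository
trigger:
  - master

pool:
  vmImage: 'windows-2019'   # hosted agent

steps:
  # Compile the solution; the name MyApp.sln is a placeholder
  - script: dotnet build MyApp.sln --configuration Release
    displayName: Compile

  # Run unit tests and publish trx results for trending
  - script: dotnet test MyApp.sln --configuration Release --no-build --logger trx
    displayName: Unit tests
```

From there, packaging, releases, and environment progression can be layered on one capability at a time, without touching Jira or GitHub.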