ALM, databases, TFS build vNext (2015) and Redgate: A tutorial for my SQL in the City demo

I’m in Seattle chilling out around PASS Summit, meeting old SQL friends and duelling with jetlag. A couple of days ago I did a talk with Brian Randell MVP (b|t) at SQL in the City. The talk featured a demo of a CI process for SQL Server using Team Foundation Build vNext (TFS 2015) and Redgate’s SQL CI tool (part of the DLM Automation Suite). I thought I’d take some quiet time to write a post to explain how to set up something similar for yourself.

First off – here’s the video: [video embedded in the original post]

Step 1: Upgrade TFS

I love Microsoft, I really do, and I love SQL Server. But TFS Build was an unintuitive and poorly designed set of components that was impossible to use in a practical way. What was the point of ‘controllers’? Why did I have to configure it through VS rather than through a web interface? XAML, really?!

Anyway, I use the past tense because, woah, have things changed! Team Foundation Build 2015 and the hosted version, Visual Studio Online (VSO), are a different story. Like a phoenix from the ashes, Microsoft have produced a tool that is genuinely attractive, nice to use and competitive with other CI tools like Jenkins, TeamCity and Bamboo. It has a live build log, a side-by-side diff view of old vs new build configurations, and far better support for doing anything other than running MSBuild.

I’m converted.

If you are still using TFS 2013, start making plans to upgrade. Do it now. Rarely does a software update provide as much value as this one.

I’m not going to talk about the TFS installation/upgrade process because it’s basically a next, next, next job, and I’m sure it is documented in more detail somewhere that is actively maintained. Also, if you use VSO you don’t need to install TFS – win.

This tutorial uses on-premise TFS, but the process would be almost identical for VSO. The one catch is that you need to install the Redgate DLM Automation Suite on your build agent, which can be trickier with hosted or single-use VMs.

Step 2: Create a new build definition for your project

I’m assuming you already have a TFS project with either a TFVC (traditional TFS) or Git source control repository, and that you use Redgate SQL Source Control to maintain your database scripts there. Now, for this project, you need to create a build definition.

A build definition is a set of instructions you give to TFS. It allows you to tell TFS what to do and when to do it. In our case we’ll use a build definition to tell TFS to run a bunch of commands.

One of the really cool things about the new TFS is I don’t need to wait for Visual Studio to load in order to read, create or edit a build definition. I can do it from the web interface. This is much easier and faster.

  1. Open TFS in a browser and go to the BUILD tab.
  2. Click the green + button in the top left corner.
  3. Select a build definition template. In my case I simply used an empty template (at the bottom).

[Screenshot: creating a new build definition]

Step 3: Add build steps

This is one of the headline features of new TFS. You can click the “Add build step” button and choose from a list of different sorts of actions. The whole experience of designing a build process feels much more like what we have come to expect from tools like TeamCity and Octopus Deploy.

What’s more, Microsoft is really embracing other platforms and there are even standard built-in steps to do really weird stuff with androids and penguins.

[Screenshot: the list of available build steps]

We don’t need anything nearly as exotic as that; a simple command line step will meet our needs:

  1. From your newly created build definition click “Add a build step”.
  2. Select a “Command Line” build step from the Utility selection.
  3. For the “Tool” field write “sqlCI.exe”. (You’ll also want to add the path to sqlCI.exe to the Path system variable on your build agent, as shown in the sketch after this list. This will make for simpler testing and configuration.)
  4. Paste the following into the Arguments field, replacing the relative path and DB name appropriately:
BUILD /scriptsFolder=RELATIVE_PATH_TO_DB_SCRIPTS /packageId=NAME_OF_YOUR_DB /packageVersion=0.$(Build.BuildNumber)
  5. Click save. Then click “Queue build…” to test it works. If it doesn’t work, review the build log to troubleshoot errors. (The new live build log is another awesome TFS feature.) If you can’t figure it out, tweet me – it’s literally my job to help.
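
As promised above, here’s a minimal sketch of adding SQL CI to the Path from an administrative command prompt. The install folder below is an assumption based on a typical DLM Automation Suite install – check where sqlCI.exe actually lives on your agent, and note that the build agent service needs a restart before it picks up the change:

setx PATH "%PATH%;C:\Program Files (x86)\Red Gate\SQL CI" /M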

You may notice the variable $(Build.BuildNumber) in my command. This is yet another awesome feature of the new TFS: it has much better support for both custom and system variables. This particular variable allows us to easily pass the TFS build number through to the command in order to generate packages with version numbers that correlate with the build numbers. Simple stuff like passing build numbers through to MSBuild used to require tonnes of horrible XAML scripting. Not any more. (Note: you do need to use a TFS build number format that NuGet will accept for this to work.)
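
For example (an illustrative setup, not the only one that works): if you set the build number format, found on the General tab of the build definition, to

1.0$(Rev:.r)

then TFS produces build numbers like 1.0.7, and the /packageVersion=0.$(Build.BuildNumber) argument above yields the NuGet-friendly package version 0.1.0.7. The default format, $(date:yyyyMMdd)$(rev:.r), also works, because every dotted segment it produces is numeric.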

You can see a list of the TFS system variables that are available to you here:

https://msdn.microsoft.com/en-us/Library/vs/alm/Build/scripts/variables

What this command does is build a new database from your source code in LocalDB, drop it again, and finally generate a NuGet package for you. You can tweak this behaviour using the command line switches that are documented here:

http://documentation.red-gate.com/display/SCI1/Using+the+command+line

For example, you can specify that the build should run on a specific SQL Server instance rather than LocalDB, and you can use SQL Authentication instead of the default Windows Authentication. You can even run the build in a specific named database, which can be really useful when dealing with filegroups, dependencies between databases, or draconian policies about permissions to create new databases.
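
To give a flavour, a build against a named instance using SQL Authentication might look something like the line below. Treat this as a sketch only: the server, credentials and switch spellings are illustrative, so check the documentation linked above for the exact switch names in your version of SQL CI.

BUILD /scriptsFolder=RELATIVE_PATH_TO_DB_SCRIPTS /packageId=NAME_OF_YOUR_DB /packageVersion=0.$(Build.BuildNumber) /temporaryDatabaseServer=MYSERVER\SQL2014 /temporaryDatabaseUserName=ci_build /temporaryDatabasePassword=MY_PASSWORD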

If the build passes, you know that the database code is deployable: there are no syntax errors or missing dependencies that would break a deployment. Your database will “compile”. (@Terminology police: cut me some slack. “Compile” is accurate enough for our purposes.) You have passed the lowest bar and simplest test. If your build does not pass, you should stop all dev work until your build is green, since no other dev work has any value if the thing is undeployable.

Step 4: Run tests

So far we have only tested if the code is deployable. We aren’t testing for regressions or bugs. However, we have created a really great foundation on which to build some level of automated testing.

Regardless of your views on testing strategies, it is hard to argue that doing no testing at all is justifiable. Simply put, database folk generally suck at confidently making changes without breaking stuff. Automated tests would help us enormously, and tSQLt (open source) and Redgate SQL Test are a pair of neat tools that can really help. If you are using them, this step will ensure your tests are run against every commit.

I’m not going to go into much detail here, but if you are using tSQLt to unit test your database you can run those tests with another command. Repeat steps 1-5 above to add another Command Line step that calls SQL CI, but this time use the following arguments:

TEST /package=$(Build.SourcesDirectory)\NAME_OF_YOUR_DB.0.$(Build.BuildNumber).nupkg

You’ll notice here that I’m using more TFS system variables and that you’ll need to provide the DB name again. It is important you enter it here exactly as you did in the previous command. (You might like to avoid repetition by adding your own variables in the variables tab so you only need to enter your DB name once.) Again, you can use more switches as documented on the Redgate documentation site to tweak the behaviour, for example, to only run a subset of your tests.
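
To expand on that variables suggestion, here’s a minimal sketch: define a variable called DatabaseName (the name is arbitrary) in the Variables tab, set its value to your database name, and the two argument lines become:

BUILD /scriptsFolder=RELATIVE_PATH_TO_DB_SCRIPTS /packageId=$(DatabaseName) /packageVersion=0.$(Build.BuildNumber)

TEST /package=$(Build.SourcesDirectory)\$(DatabaseName).0.$(Build.BuildNumber).nupkg

Now the database name lives in exactly one place.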

By default this command will use the package generated by our first command to build a new database and run all your tSQLt tests. You can configure it to be more selective if you want and there are clever ways to parallelise tests if they take a long time and you want results more quickly.
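
If you haven’t seen tSQLt before, here’s the flavour of test this step runs. This is a minimal sketch – the table, function and expected value are entirely hypothetical, but the tSQLt procedures are real:

-- Create a test class (a schema that groups related tests)
EXEC tSQLt.NewTestClass 'OrderTests';
GO
-- Hypothetical test: an order with no order lines should total zero
CREATE PROCEDURE OrderTests.[test GetOrderTotal returns zero for an empty order]
AS
BEGIN
    -- Swap the real table for an empty fake so the test is isolated
    EXEC tSQLt.FakeTable 'dbo.OrderLines';

    DECLARE @actual DECIMAL(18,2) = dbo.GetOrderTotal(42);

    EXEC tSQLt.AssertEquals @Expected = 0, @Actual = @actual;
END;
GO
-- Run every test in the class (the TEST command above runs all your classes for you)
EXEC tSQLt.Run 'OrderTests';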

Step 5: Deployment

In my demo I used Octopus Deploy for deployments, but you could equally well use VSO. The configuration follows a very similar process where you add various deployment steps, just as we have done for our builds.

If you want to see how I used Redgate SQL Release to handle my deployments, I used an approach very similar to the one documented here. That meant all that was left was to get TFS to hand my NuGet package off to Octopus Deploy and to send Octopus a message to execute a deployment.

I achieved this by adding two more Command Line steps. The first published the package to my NuGet feed using Redgate SQL CI; in my case I used the built-in Octopus Deploy feed. The second used the Octopus Deploy command line tool, octo.exe, to trigger the deployment to my integration environment. For this task I needed to install octo.exe on my build agent, and I added its file path to my Path system variable just as I did for sqlCI.exe.

My SQL CI arguments for the publish build step were as follows. You’ll need to replace the feed URL and API key with your own. There are additional parameters for usernames, passwords etc. if required:

PUBLISH /package=$(Build.SourcesDirectory)\SimpleTalkDB.0.$(Build.BuildNumber).nupkg /nugetFeedUrl=http://MY_OCTOPUS_SERVER:PORT/nuget/packages /nugetFeedApiKey=API-QWERTYUIOP1234567890

For Octopus Deploy I used the following arguments:

create-release /server=http://MY_OCTOPUS_SERVER:PORT /apiKey=API-QWERTYUIOP1234567890 /version=0.$(Build.BuildNumber) /project=MY_OCTOPUS_PROJECT /deployto=MY_INTEGRATION_ENVIRONMENT /force /waitfordeployment

Full documentation for octo.exe is available here:

http://docs.octopusdeploy.com/display/OD/Octo.exe+Command+Line

Step 6: Triggering

All that is left to do now is add a trigger. I want to run this build every time code is committed to source control. To do that, go to the Triggers tab within your build definition and add a “Continuous Integration (CI)” trigger:

[Screenshot: adding a Continuous Integration trigger]

Plug-ins

In my demo I used the command line calls directly because I wanted to explain the process in a way that makes it easy to understand what is happening under the hood. I also wanted to make it very obvious how people who use automation tools other than TFS could achieve the same task. However, Redgate does have various plug-ins available for TFS 2013, VSO and (coming soon) TFS 2015 on-premise. Redgate also has plug-ins for Jenkins, Bamboo and TeamCity.

These plug-ins add a set of dedicated Redgate build steps that can be used instead of the “Command Line” build step. They mean you don’t need to mess around learning all the command line syntax; instead you can simply tick boxes and type database names into text fields. These plug-ins are easier to use and can make it easier for other team members to understand how your builds work and to maintain them.

In summary

Configuring your build process in the new TFS is a radically improved experience. Where before you had to deal with XAML, work in Visual Studio, and build MSBuild projects and C# solutions just to call SQL CI, now you can simply tell TFS to run a set of commands. TFS has a significantly more usable interface with all sorts of genuinely useful features, like the improved variables and the broad variety of tasks.

The excuses for not implementing at least some level of automated testing are running out. If you want to be able to deliver code more frequently and with more confidence than your competitors you simply have to invest in some sort of automated testing of your source code commits. The Redgate tools combined with the remarkably improved TFS Build provide a good platform to implement this.

Also, attending SQL in the City and PASS Summit has been a tremendously valuable experience. I wish I could stay longer but sadly I have a plane to catch. It’s wonderful to catch up with so many of my SQL Family and develop our professional relationships and growing friendships.

If you didn’t make it to Seattle this year I strongly recommend you come along next year. I’ll certainly be making an effort to stay longer in 2016.
