You’ve got all the environments in place for building a database deployment pipeline, and now you need to prepare them so you can actually build one. What does that preparation involve? Start with your Build & Deploy environments.
Schematic overview of the Database Deployment Pipeline
In the first post of this series I showed you the set of environments we use for database delivery. In this post I’ll show you how I prepared these and the other environments involved for Continuous Integration, so that we could build Database Deployment Pipelines. I am going to extend the overview of our set of environments, the DTAP street, with the Build & Deploy environments:
My goal was to:
- Set up our Build Environment (TeamCity) in a way that enables it to watch the database source control repositories, and automatically create builds when changes are committed to those repositories.
- Set up our Deployment Environment (Octopus Deploy) to enable it to deploy these builds to the D, T, A and P environments with the push of a button.
For that, I had to install Redgate’s DLM Automation Suite in both the build environment and the deployment environment (the deployment server and its agents).
Installing Redgate’s DLM Automation Suite in the Build Environment
Our build environment at that time consisted of a build server, which (for reasons I mentioned in my previous post in this series) I call our CI Server, and several build agents. The CI Server is where we configure everything involved in the build process; the build agents do the actual building. Schematically:
Note that the build agents run on their own servers. When a build configuration on the CI Server is triggered to create a build, for instance because a database change was committed to source control, the CI Server hands the work to one of the build agents. I also included the Package Management Server in the overview, because the build artifacts are stored in its repository.
As a DBA, you are not responsible for managing CI and build servers, and most likely your colleagues from operations do this already. So now is the time to get them involved: ask for their help installing the DLM Automation Suite on the CI server and (some of) the build agents. Installation is pretty straightforward, and instructions can be found here. Be sure to install the suite on your CI server and on at least one build agent.
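To give an idea of what a build step on an agent can then do, here is a sketch using the DLM Automation PowerShell cmdlets. The paths, package id, feed URL, and environment variables are placeholders for illustration; your build server will supply its own values.

```powershell
# Sketch of a build step on a build agent (placeholder paths and names).

# Validate that the database scripts folder builds into a working schema
$validated = Invoke-DlmDatabaseValidation "C:\BuildAgent\work\MyDatabase\state"

# Package the validated source as a NuGet package - this is the build artifact
$package = New-DlmDatabasePackage $validated `
    -PackageId "MyDatabase" `
    -PackageVersion "1.0.$env:BUILD_NUMBER"

# Push the artifact to the Package Management Server's NuGet feed
Publish-DlmDatabasePackage $package `
    -NuGetFeedUrl "http://packages.example.local/nuget" `
    -NuGetApiKey $env:NUGET_API_KEY
```

This is the essence of what the build configuration automates: validate the committed database source, turn it into a versioned artifact, and publish that artifact for the deployment environment to pick up.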
Installing Redgate’s DLM Automation Suite in the Deployment Environment
Our Deployment Environment consists of an Octopus Deploy server, where we configure releases based on builds and from where we deploy these releases to the DTAP environments. The actual deployment work is done by an agent called an Octopus Tentacle, which runs on the server the release is deployed to. Schematically:
Note that if you are still running Windows Server 2003, the DLM Automation Suite cannot be installed there. You can still make it work by running the Tentacle for the SQL deployments on your Octopus Deploy server and letting that Tentacle deploy releases to the desired environments. This is what we did, and in that case the schema looks like this:
As with the CI and build servers, managing these agents is not your job, so again ask your colleagues from operations to help install the suite on your deployment server. Install an Octopus Tentacle together with the DLM Automation Suite on each database server, and your environments are ready to be configured for automated database deployment pipelines!
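What the Tentacle then does at deploy time can again be sketched with the DLM Automation cmdlets. Server names, database names, and package paths below are hypothetical; Octopus substitutes the real values per environment.

```powershell
# Sketch of a deployment step run by an Octopus Tentacle
# (placeholder server, database, and package path).

# The target database in this DTAP environment
$target = New-DlmDatabaseConnection -ServerInstance "sql-test-01" `
                                    -Database "MyDatabase"

# Compare the package contents against the target database
# and generate an update script for the differences
$release = New-DlmDatabaseRelease `
    -Source "C:\Octopus\Packages\MyDatabase.1.0.42.nupkg" `
    -Target $target

# Apply the update script to the target database
Use-DlmDatabaseRelease $release -DeployTo $target
```

The same step runs against D, T, A, and P in turn; only the target connection changes, which is exactly what makes the push-button deployment repeatable.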
If your company has already set up a Build & Deploy environment, installing Redgate’s DLM Automation Suite into it is easy. This lets you create automated build configurations and configure your deployment server so that your teams can create database releases and deploy them at the push of a button. How to actually create those configurations will be the subject of the next posts.