In the previous articles on Jenkins CI integration with .NET projects we talked only about builds, tests, and analysis. This is great, but the main benefit of a CI server is the delivery process. In this article we will review how to deliver continuously. Note that these are just a couple of suggestions and tools you can use to do the job; the actual continuous delivery process is a very big topic and is not the subject of this post.
Visual Studio Web Deploy
When you want to deliver to different environments, you need a different configuration file for each environment. The reason is that different environments or client builds need different connection strings, file servers, etc. Microsoft provides web.config transformations for this. To create transformation files for all environments you need to:
- Go to ‘Build -> Configuration Manager…’.
- On your presentation project (Web, Desktop, etc.) create a configuration for each of your environments, like Test, Stage, Production Client 1, etc…
- Go to your Web.config and on the context menu select ‘Add Config Transform’. This will add a transformation file for each of your environments, which you can then edit.
- Now you need to create a publish profile for each of your environments. You can do this by going to your project’s context menu and selecting ‘Publish…’.
- Publish to the environment you like.
For more transformation options see the links in references.
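As an illustration, a minimal Web.Test.config transform might look like the sketch below. The connection string name, server name, and settings are placeholders; the xdt:Transform/xdt:Locator attributes are the standard transformation syntax.

```xml
<?xml version="1.0"?>
<!-- Web.Test.config: applied on top of Web.config when publishing
     with the Test configuration. Names and values are examples only. -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Replace the attributes of the connection string matched by name. -->
    <add name="DefaultConnection"
         connectionString="Data Source=test-sql;Initial Catalog=MyApp;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
  <system.web>
    <!-- Turn off debug compilation outside of development. -->
    <compilation xdt:Transform="RemoveAttributes(debug)" />
  </system.web>
</configuration>
```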
On your publish servers you need Web Deploy installed, unless you are publishing to a local network file system.
Editing the publish profile
On some deploys you will need to include extra files, like dynamically loaded DLLs. To do this you need to edit all your publish profile files (they should be located at MY_PROJECT_FOLDER/Properties/PublishProfiles/*.pubxml).
The following target adds to the ‘/bin’ folder of the publish archive the DLL files that start with ‘App.Lib’, as well as ‘AuthorizeNet.dll’.
<Target Name="CustomCollectFiles">
  <ItemGroup>
    <_CustomFiles Include="bin\App.Lib.*.dll" />
    <_CustomFiles Include="bin\AuthorizeNet.dll" />
    <FilesForPackagingFromProject Include="%(_CustomFiles.Identity)">
      <DestinationRelativePath>bin\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
</Target>
<PropertyGroup>
  <CopyAllFilesToSingleFolderForPackageDependsOn>CustomCollectFiles;</CopyAllFilesToSingleFolderForPackageDependsOn>
  <CopyAllFilesToSingleFolderForMsdeployDependsOn>CustomCollectFiles;</CopyAllFilesToSingleFolderForMsdeployDependsOn>
</PropertyGroup>
Jenkins Web Deploy
From Jenkins CI you will want to deploy to the different environments. When you do, you may want to set some extra parameters in the config files, such as the application version or logging options. You can do this with an XPath tool like this one. An example of how to use it:
REM 'Change Configs On Pre-Deploy.'
"C:\Tools\XPath.exe" -file "%WEBPROJECT%\Web.config" -xpath //configuration/appSettings/add[@key='AppVersion']/@value -value "1.%BUILD_NUMBER%.%SVN_REVISION%"
"C:\Tools\XPath.exe" -file "%WEBPROJECT%\NLog.config" -xpath //ns:nlog/ns:rules/ns:logger[@writeTo='mail']/@minlevel -value "Error" -ns http://www.nlog-project.org/schemas/NLog.xsd
After that we deploy with MSBuild. If you have a publish profile named ‘DEPLOY.TO.STAGE’, you can deploy it like this:
Command Line Arguments
/p:DeployOnBuild=true /p:PublishProfile=DEPLOY.TO.STAGE.pubxml /v:q /nologo
Third party services
Often, after a build or deploy, you will want to notify a third-party application or service. The problem is that there aren’t plugins for all applications out there, and some plugins do not work with every version of Jenkins.
For example, that was the issue when I tried to notify https://www.pivotaltracker.com/ of a successful job completion. There is an existing plugin for Pivotal Tracker integration, but for some reason it didn’t work with my version of Jenkins.
So how do you communicate with such third-party apps and services? You have three options:
- Find a plugin that works. There are many Jenkins plugins and some of them can help you. Go and try to find one here before doing anything else.
- Write your own plugin. If you are a fan of Jenkins you can always contribute to the project by writing a plugin that integrates with one service or another. The problem is that I have found it hard to even start developing plugins for Jenkins, because the process is not easy. And sometimes you just want a quick and simple solution.
- Use the Jenkins CI API. Jenkins provides an option to extract the current build parameters in XML or JSON format. You can filter the parameters with XPath, transform them, and send them to a third-party service.
In order to use the API you will need an access token. You can set it in ‘Your Jenkins Profile -> Configure -> API Token’.
After that you can access the build properties simply by calling %BUILD_URL%/api/xml. You can add wrapper and xpath query parameters. You also have to use basic authentication with your Jenkins username and API token.
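As a sketch of this, the Python snippet below builds an authenticated request for the api/xml endpoint and then pulls a parameter value out of a sample response with an XPath-style query. The sample XML only mimics a typical parameterized-build response; the URL, user name, and token are placeholders.

```python
import base64
import urllib.request
import xml.etree.ElementTree as ET

def api_request(build_url, user, token):
    """Build an authenticated GET request for BUILD_URL/api/xml (not sent here)."""
    url = build_url.rstrip("/") + "/api/xml"
    req = urllib.request.Request(url)
    auth = base64.b64encode(("%s:%s" % (user, token)).encode()).decode()
    req.add_header("Authorization", "Basic " + auth)
    return req

# A trimmed-down example of what api/xml returns for a parameterized build:
sample = """<freeStyleBuild>
  <action><parameter><name>AppVersion</name><value>1.42.1337</value></parameter></action>
  <result>SUCCESS</result>
</freeStyleBuild>"""

# Filter out the parameter we care about with an XPath-style query.
root = ET.fromstring(sample)
version = root.find(".//parameter[name='AppVersion']/value").text
print(version)  # 1.42.1337
```

From here you can reshape the extracted values into whatever payload the third-party service expects.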
An example of an application that takes the changes and sends them to Pivotal Tracker can be found here.
The idea of continuous integration and delivery is that you integrate and deliver periodically, or when a given event occurs.
That way you avoid big and painful integrations and deliveries.
Continuous integration means that all developers keep their work integrated.
You can read more about integration and delivery here.
You can use ‘Poll SCM’ to trigger the integration process. At a given interval, Poll SCM checks the remote repository for changes and starts the job if any new pushes are detected. A better way is to trigger a build on commit. One way to do this is to use post-commit hooks that send a POST request to JENKINS_URL/job/JOBNAME/build?token=TOKEN. Some servers, like GitLab, have integrated solutions for triggering a build right after a push.
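A post-commit hook can be a small executable script in any language; as a sketch, here is one in Python (the Jenkins URL, job name, and token are placeholders for your own setup, and the actual POST is left commented out):

```python
#!/usr/bin/env python3
# Sketch of a .git/hooks/post-commit hook that asks Jenkins to start a job.
# JENKINS_URL, JOB_NAME, and TOKEN are placeholders for your own setup.
JENKINS_URL = "http://jenkins.example.com:8080"
JOB_NAME = "MyProject-PostCommit"
TOKEN = "mytoken"

def trigger_url(base, job, token):
    # Jenkins starts a job on a POST to /job/<name>/build?token=<token>
    return "%s/job/%s/build?token=%s" % (base, job, token)

url = trigger_url(JENKINS_URL, JOB_NAME, TOKEN)
print(url)
# In the real hook you would send the request, e.g.:
# import urllib.request
# urllib.request.urlopen(urllib.request.Request(url, method="POST"))
```

Remember to make the hook file executable so Git can run it after each commit.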
These post-commit builds can integrate feature branches into the master branch, create a merge request, or just compile the project to make sure it builds.
There are many possible configurations.
One possibility is a post-commit job that only builds the projects and runs the tests, then a daily build that integrates into the master branch, and a weekly build that runs the acceptance tests and creates a tag that can be deployed to the test environment.
The setup will differ depending on the project’s needs.
In the first post of the series we described how to use a pre-master branch. Jenkins gives you other possibilities as well.
Developers can use feature branches, and Jenkins can integrate them into master or create a merge request; the Git plugin allows this.
Another option is to use feature toggles. You can include a feature-toggle framework like these (preview some of them here). With the Web.config transformations you can activate different features for different environments and clients.
One useful thing to do when you finish your integration into master is to periodically extract documentation of your code and an overall description of your project.
This can be deployed to a public site or published through the Jenkins HTML Publisher plugin.
Continuous delivery is the process of continually delivering stable versions to your environments. Plugins like Violations can mark your build as stable or unstable.
After a stable build we have the opportunity to create tags. That way stable builds will be ready for deployment to the test or stage environment, where testers and clients can test and review the application and the new features.
The continuous delivery process can contain many configurations, like post-commit, daily, and weekly jobs, jobs that deploy to different environments, etc. We can create jobs for every configuration and execute the same commands over and over again, but this becomes harder to maintain and harder to comprehend. A better way is to organize all jobs for a given project into a delivery pipeline (see references). After setting up the delivery pipeline (by triggering jobs one after another) you can visualize the process with pipeline view plugins like the Delivery Pipeline Plugin and the Build Pipeline Plugin.
Sometimes you want to promote a version from the test environment to the stage environment or above. However, it is better to do this only after the QA team has deemed the version stable. So you will need to allow external users of Jenkins to trigger builds manually. This can be done with triggers.
Manual triggering of a job can be done by email. There are plugins that can perform this, like the Mail Commander Plugin and the Mailbox Trigger Plugin. They can do the job, but lack validation and authentication if you require them.
Another way to allow manual triggering of specific jobs is to write your own tool or script that accesses the Jenkins CI API for triggering jobs. You can write a small page or a script and give it to the people who will promote the builds to the different environments.
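A minimal sketch of such a script, assuming a hypothetical ‘PROMOTE.TO.STAGE’ job and placeholder credentials: it builds an authenticated POST against the buildWithParameters endpoint, which is how Jenkins triggers parameterized jobs remotely.

```python
import base64
import urllib.request
from urllib.parse import urlencode

def promote_request(jenkins_url, job, user, token, params):
    """Build an authenticated POST request that triggers a parameterized job."""
    url = "%s/job/%s/buildWithParameters?%s" % (jenkins_url, job, urlencode(params))
    req = urllib.request.Request(url, data=b"", method="POST")
    auth = base64.b64encode(("%s:%s" % (user, token)).encode()).decode()
    req.add_header("Authorization", "Basic " + auth)
    return req

# 'PROMOTE.TO.STAGE', the user, and the token are placeholders for your setup.
req = promote_request("http://jenkins.example.com:8080", "PROMOTE.TO.STAGE",
                      "qa-lead", "apitoken", {"VERSION": "1.42.1337"})
print(req.full_url)
# When the QA team deems the version stable, the script would send it:
# urllib.request.urlopen(req)
```

Wrapping this in a small page or command-line tool lets you add your own validation before the trigger is sent, which the email plugins above cannot do.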
When you want to promote to different environments, it is a good idea to preserve the code as an artifact and then transform and deploy it. One possible MSBuild project that you can use is located here. You can use it in your project and modify it for your needs. If you call it with the following command, it will copy all files for publishing, together with all Web.config transform files, to a single location.
Command Line Arguments
- The WEB_TARGETS_PATH is the path to your web publish targets. This will typically be ‘C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v12.0’ or the corresponding folder for another VS version.
- The PROJECT_NAME is your project name.
The next command, against the same MSBuild project file, will transform the Web.config with Web.Test.config and create a package file for deployment.
Command Line Arguments
MSBuild.xml /t:PrepareDeploy /p:PublishTargetsDirectory=%WEB_TARGETS_PATH%;
Now you can use msdeploy to deploy to the test environment.
Delivery process diagram
- Web Deployment: Web.Config Transformation
- Jenkins Parameterized Builds
- Git, Feature Branches, and Jenkins – or how I learned to stop worrying about broken builds
- Continuous Delivery: Anatomy of the Deployment Pipeline
- Orchestrating Your Delivery Pipelines with Jenkins
- How to use Jenkins for Job Chaining and Visualizations