Jenkins deployment

Overview

As a first step towards a new infrastructure and a controlled deployment process, we have installed Jenkins on what is currently known as the Staging Server. In the terminology of the proposed new architecture, this is the Build Server and will be referred to as such.

Jenkins is a Continuous Integration tool which allows us to standardise and, to some degree, automate the deployment of code to various environments. We create jobs in Jenkins to perform a task or create a build. A build is a snapshot of the codebase for a given branch and allows us to know exactly what state the code we are working with is in. Jobs can be triggered manually, periodically or by updates to a source control manager (SCM), such as a git push.

Our Jenkins server can be found directly at build.atdtravel.com:8080. If you leave the port out, the request will be passed through Varnish. Occasionally the CSS breaks due to an unknown issue; this can be fixed by hard-refreshing your browser cache (e.g. Cmd-Shift-R in Chrome on OS X, Ctrl-F5 in Firefox on Windows, etc.).

Deployment fundamentals

The basic premise of automated deployment is that it is based on absolutes; all elements must be in a known state and all processes must be repeatable. 

The concept of making ad-hoc changes directly on Live is obsolete. From now on, all configuration changes must be managed in code. There is a well-established process for doing this in Drupal using update hooks, and there is a module called atd_deploy specifically for developers to add their update hooks to. There are previous usage examples in there already, as well as some helper functions to make repetitive tasks easier.

For example, say you have a feature branch that you want to deploy to Test Instance 1, but you know that you need to enable a module to make it work. Rather than build the test instance and use Drush or the admin screens to manually enable the module, simply add an update hook to atd_deploy that executes drupal_install_modules($array_of_modules).

Drush updatedb is run after every deployment so this code would then be executed post-build and the requirement for a manual task is removed. More importantly it means that this task will be performed in the same way every time it is deployed. When this code is eventually deployed to Live, the same update hook will be executed and you should be confident in achieving the same result on Live as you did when you developed it locally and against a test instance.
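
As a rough illustration, the post-build step amounts to something like the following (the docroot path below is a placeholder; the real locations are defined in the Ant build and properties files):

# run any pending update hooks against the freshly built instance
cd /var/www/dev-instance/docroot   # hypothetical docroot path
drush updatedb -y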

As an aside, this way of working also makes for easier collaborative working. There's nothing more annoying than having to download a new database snapshot for some required functionality when you've got a load of development settings and test data on your local machine.

Basic job configuration

See also: How to create a Jenkins job.

Most jobs consist of two main parts: SCM configuration (repo / branch) and build steps to do whatever is required to make it work. Rather than go down the route of shell scripting, which can easily spiral out of control and turn into a maintenance nightmare, I've opted to use Apache Ant build scripts. The build XML and properties files are contained in the Configuration Management repository.

Although it's not enforced, it's best to leave spaces out of job names and use underscores instead.

For jobs that shouldn't be run simultaneously, such as building dev / refreshing dev or downloading a snapshot and refreshing, you can set up locks in the main Jenkins configuration. For example, there is a Dev lock which means that when a build is automatically kicked off whilst a refresh is taking place, it will be queued until everything is ready rather than failing to run drush against an incomplete site.

Current job summary

Job | Trigger | Description
Build_dev_instance | Polls SCM for updates every minute | Deploys code to *-dev.atdtravel.com and runs drush updatedb.
Fetch_live_db_and_files | Every 24 hrs at 5am | Transfers a copy of the live db snapshot and rsyncs the live files directory to the build server.
Refresh_dev_instance | Manual | Rebuilds the dev db from the latest snapshot, sanitises the data, fixes domain configuration and runs drush updatedb.
Refresh_jenkins | Polls SCM for updates every minute | Deploys the configuration management repo to /opt/build-files/managed-config

Further explanation

Remote trigger

Build_dev_instance and Refresh_jenkins don't need to rely on SCM polling. Jenkins has an API for remote control, so a build can be triggered by a git hook after each push. This follows more of an observer pattern, rather than forcing Jenkins to poll the repository for its state every minute!
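
As a sketch of how this could work, a post-receive hook (or a Bitbucket webhook) would simply hit the job's remote build URL; the token below is a placeholder that would be set in the job's 'Trigger builds remotely' option:

#!/bin/sh
# post-receive hook sketch: kick off the Jenkins job after each push
curl -fsS "http://build.atdtravel.com:8080/job/Build_dev_instance/build?token=REPLACE_WITH_TOKEN"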

Build_dev_instance

Despite its name, this job has been set up to continuously build the DSD migration feature branch. This is with the intention of changing it over to the Development branch once we start using a Git branching strategy.

After creating the build, Ant is invoked to execute the build-dev target defined in atd-build.xml. This is basically a wrapper used to provide parameters to the more generic build-phoenix target, which recreates symlinks and runs drush updatedb (a rough sketch of the invocation is shown below).

  • Implement remote trigger
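
A rough sketch of the underlying Ant invocation, assuming the build files sit in the managed-config checkout on the Build server:

# sketch: run the build-dev target from the configuration management checkout
cd /opt/build-files/managed-config
ant -f atd-build.xml build-dev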

Fetch_live_db_and_files

This is a simple job that just executes the following Ant targets: retrieve-db-backup and rsync-files. An SSH key has been set up so that the Jenkins user on Build can access Live Phoenix and DSD without requiring a typed password. The 5am execution should allow enough time for the back-up process to have been run.
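
In outline, those two targets do something along these lines (the hostnames and paths here are assumptions, not the real values from the properties files):

# retrieve-db-backup (sketch): copy the latest live db snapshot to the build server
scp jenkins@live-phoenix:/backups/latest.sql.gz /opt/build-files/db-snapshots/
# rsync-files (sketch): mirror the live files directory to the build server
rsync -az jenkins@live-phoenix:/var/www/phoenix/sites/default/files/ /opt/build-files/live-files/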

Refresh_dev_instance

Another simple job, executing only the refresh-dev Ant target. In a similar vein to build-dev, this is a wrapper to provide parameters to the generic refresh-phoenix target, which re-creates the db and imports the snapshot. It then sanitises the data: this entails removing any sensitive data that was in the raw snapshot, such as email addresses, and resetting all passwords to a fixed value of 4td-p4ssw0rd-0 to make testing easier. It also runs drush updatedb.
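
Whether the sanitisation is done with Drush or hand-written SQL isn't covered here, but the effect is roughly that of Drush's built-in sanitiser, e.g.:

# sketch: scrub email addresses and reset every password to the fixed test value
drush sql-sanitize --sanitize-email=user+%uid@example.com --sanitize-password=4td-p4ssw0rd-0 -y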

This job could be combined with Build_dev_instance if the latter were only built on a daily schedule, but since it is built continuously, combining them would just slow things down too much. Under normal circumstances it would make sense to set this job to run nightly, after the new database snapshot has been downloaded, with developers given the option to kick it off whenever else they choose.

Refresh_dsd_dev_instance

This job executes the refresh-dsd-dev Ant target. It is set to run daily at 6am. It re-creates the dsd_migrate database using a backup which is downloaded during the Fetch_live_db_and_files job.

Refresh_jenkins

The main purpose of this job is to provide an easy way for developers to edit build scripts remotely and have them immediately usable by Jenkins. This will continuously build the master branch of the configuration management repo and deploy it to /opt/build-files/managed-config on the Build server.

  • Implement remote trigger

Jobs still to create 

Extend refresh_jenkins

In the absence of any server provisioning tool like Chef, it's probably worth extending this job to push out builds of the configuration repo to the same location on all servers, including Live, so that it's really easy to manage updates to things like the Varnish template and have them in a convenient location to put live.

Build / refresh test instance

Developers need to be able to build a test instance on demand. This is useful for both their own testing and for providing a stable test area for others. Given the size of the development team, it's probably worth starting out with about six instances and creating more if required.

These two jobs will be clones of the current dev jobs but will need to be parameterised so that the developer can choose which branch to deploy and which test instance to use. 
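
A sketch of how the parameterised build step might look, passing Jenkins build parameters through to the generic target (the parameter names BRANCH and INSTANCE are assumptions):

# sketch: hand the job's build parameters to the generic Ant target
ant -f atd-build.xml build-phoenix -Dbranch=${BRANCH} -Dinstance=${INSTANCE}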

Build and deploy release to Live

Assuming we're following a git branching strategy, this job will build from master and copy the files to the live server(s). This can be done easily with the Publish over SSH plugin, which is what I had running on my prototype, but I think it would be better to look into a solution that can be managed from within the build script, so that everything is kept in the same place and under source control. This can be achieved with the scp and sshexec Ant tasks.
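
For orientation, the shell equivalent of what those tasks would do is roughly the following (the hostname and paths are assumptions):

# sketch: copy the master build to the live server, switch the symlink and run updates
scp -r build/ jenkins@live-web:/var/www/releases/${BUILD_NUMBER}/
ssh jenkins@live-web "ln -sfn /var/www/releases/${BUILD_NUMBER} /var/www/current && drush -r /var/www/current updatedb -y"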

Build staging environment

Once we are firmly on the road to scheduled releases and a git branching strategy, it is worth getting some kind of staging environment to do final release testing against. There's no reason why the release branch couldn't be deployed to a test instance, so this is not the highest priority.

The complexity of this job depends very much on what kind of staging environment is used. If it's a static server, then it's just a case of pushing code and refreshing. If it's an on-demand cloud VM set-up, there are plugins available that allow Jenkins to build servers for the testing period. A low-cost alternative is using Jenkins to build a suite of Vagrant VMs on the Build box and then build / refresh as required.

You might want to trigger this job by a push to the release branch if you're feeling particularly smug. A tear-down job would also be required to save on resources post-release.

Access

Username | Password
build-admin | 4td-Bu1ld-Adm1n

Debugging

From time to time, a job can fail, and there can be many reasons for this: the directory may have changed on the target server, the user may have been deleted, the key may no longer work, there may be a permissions error on a file, and so on. Each time there is a failure, consult the logs and work through the reported errors.

 

 

=====

 

Set up

Item | Location | Details
Temp Dir | /var/lib/jenkins/tmp | The standard /tmp directory had noexec set, so Jenkins could not execute the files it stores there. A dedicated Jenkins tmp directory gets around the issue.
Config File | /etc/sysconfig/jenkins | The main config file for Jenkins (port changes etc.)
Git Repository | /var/lib/jenkins/scm-sync-configuration/checkoutConfiguration/ | The location of the files that are backed up in Git so that configuration changes can be tracked
SCM Sync Configuration plugin | git@bitbucket.org:atddev/jenkins.git | The Bitbucket branch for the Jenkins jobs; see below for more details

Note: The configuration is source controlled via a branch in Bitbucket.

Access

Jenkins can be accessed at http://ci.atdtravel.com:8080/. It cannot be accessed from outside the office; if it becomes inaccessible, it may be down to a firewall rule change.

In order to log into Jenkins, use one of the following set of credentials:

Username | Password | Description
4dm1n | n1nj4123 | The admin account for setting up jobs, users and configuration
d4pl0y | n1nj4321 | The user used for executing jobs for deployments, drush requests etc.

Sudo access:

su -s /bin/bash jenkins

The ssh details can be found in /var/lib/jenkins/.ssh/.

Add New User

In order to add a new user for Jenkins, go to the Add User page.

Example: setting up a Jenkins user on a new VM:

# create the user that Jenkins will ssh in as (jenkins-example is an example name)
adduser jenkins-example
passwd jenkins-example
usermod -g atd -G jenkins-example jenkins-example
# generate a key pair for the new user (accept the default location when prompted)
ssh-keygen -t rsa
# create authorized_keys and lock down its permissions; then append the Jenkins
# user's public key from the Build server (see /var/lib/jenkins/.ssh/) so Jenkins
# can log in without a password
touch /home/jenkins-example/.ssh/authorized_keys
chmod 600 /home/jenkins-example/.ssh/authorized_keys

On the ci1 VM, you'll need to switch to the jenkins user and ssh to the new server once so that its details are added to the known_hosts file (see above for how to switch to the jenkins user). In addition, for things like a code pull, you'll need to run it manually on the VM first.
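
For example (the hostname and repository path below are placeholders):

# on ci1: switch to the jenkins user and connect once to cache the new host key
su -s /bin/bash jenkins
ssh jenkins-example@new-vm.atdtravel.com
# on the new VM: run the first code pull manually so any host key / credential prompts are dealt with
cd /var/www/site && git pull origin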

New remote hosts are added to Jenkins on the Configure System page.

Server User

Jenkins has corresponding users on our VMs that it can use to ssh in without the need for a password; access is via an authorized key. These include:

Server | Username | Password | Purpose
web1 | jenkins-web1 | n1nj4123 | Used for running drush commands and shell scripts
web6 | jenkins-web6 | n1nj4123 | Used for running drush commands and shell scripts
nfs1 | jenkins-nfs1 | n1nj4123 | Used to deploy code
cache2 | jenkins-cache2 | n1nj4123 | Used to clear Varnish in prime
web10 | jenkins-web10 | n1nj4123 | Used to clear the D8 cache
dsd1 | jenkins-dsd1 | n1nj4123 | Used for running drush commands and shell scripts for DSD
dsd2 | jenkins-dsd1 | n1nj4123 | Clone of dsd1. Access required to clear Redis.
dsd3 | jenkins-dsd1 | n1nj4123 | Clone of dsd1. Access required to clear Redis.
stage-02 | jenkins-stage02 | n1nj4123 | D6 Staging
stage-03 | jenkins-stage03 | n1nj4123 | D7 Staging
stage-04 | jenkins-stage04 | n1nj4123 | D7 DSD Staging
stage-05 | jenkins-stage05 | n1nj4123 | D8
stage-06 | jenkins-stage06 | n1nj4123 | Gateway Staging
stage-db | jenkins-database | n1nj4123 | Databases
web12 | jenkins-web12 | n1nj4123 | Gateway

 

Local Setup

A Vagrant VM is available for local testing. This VM is called 'ci', short for Continuous Integration. This can be used for testing potential jobs before being added to the live environment. It can also be used to test plugins.

Url | User | Password
http://jenkins-ci.dev:8080/ | Needs to be set up on start-up | Needs to be set up on start-up

Bitbucket

User Account

Bitbucket access for the Jenkins user is:

Email | Username | Password
dev+admin+jenkins@atdtravelservices.co.uk | atd-jenkins | 4tdjenk1ns

Configuration & Source Control

The configuration git branch for the Jenkins configuration can be found on Bitbucket. All the configuration should be source controlled via the SCM Sync Configuration plugin. This includes the configuration of Jenkins itself and all the jobs that have been created.

Automatic Code deployments

DSD live job build url: http://ci.atdtravel.com:8080/buildByToken/build?job=Staging%20-%20Server%20Config%20Code%20Deployment&token=DcJrt_tBeF9gN

It is possible to trigger a build in Jenkins via a Bitbucket Webhook.

The initial attempt to get automatic code deployments working involved setting up a Jenkins job that could be triggered externally. Using a token set in the job, a url to access it is made available. In Bitbucket, this url can then be called when the repository is updated in some way. In Bitbucket, this is known as a webhook. Using ssh, the job itself then accesses the server(s) hosting the code and runs git pull origin.
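
In other words, once Bitbucket calls the tokenised URL, the job's remote shell step is essentially the following (the server and repository path are placeholders):

# sketch of the job's build step: pull the latest code on the target server
ssh jenkins-nfs1@nfs1 "cd /var/www/dsd && git pull origin"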

There are various plugins that may offer better solutions than this. They have been included in the setup of Jenkins and can be investigated when the functionality is required.

Whilst looking into auto code deployments, I encountered timeout issues on the pushes from Bitbucket. However, this was down to it not being possible to access Jenkins externally. After amending the Firewall rules so that Bitbucket could access Jenkins, the job began to fire.

Jenkins is currently locked down, as you need to log in to access it. In order for the job url to be accessible without logging in, the Build Token Root Plugin is used.

Creating a new job

The best way to decide how to set up a job is to look at the ones that already exist. 

The initial process involves the following steps:

  1. Log in as the admin user
  2. Click 'New Item' on the left hand side
  3. Select 'Freestyle project'
  4. Click 'OK'

The majority of the jobs involve executing a shell script on one of the servers. You are not limited to one build step per job; you can add as many as required. Within the build section, you do the following:

  1. Click 'Add build step'
  2. Select 'Execute shell script on remote host using ssh'
  3. Select the server you want to execute the script from
  4. Add the command into the textarea.

The 'Build Triggers' section allows you to specify a cron-style run rule. You can also specify that the job runs straight after another Jenkins job. There are also options to throttle the build, such as 'Throttle Concurrent Builds', to make sure that the job only runs once at any given time. You can specify jobs to run after the build has completed, such as clearing the cache. Note: activating Chuck Norris in the post-build step is optional.
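
For reference, the trigger schedule fields use Jenkins' cron-style syntax, for example:

# Poll SCM: check for changes every minute
* * * * *
# Build periodically: every day at 5am
0 5 * * *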

If you want to achieve something but the functionality does not seem to be available, there could be a plugin that could help you. To add a plugin do the following:

  1. Log in
  2. Click 'Manage Jenkins' on the left hand menu
  3. Click 'Manage Plugins'

Support

  1. I cannot access the login screen
    Check whether the jenkins.war file (/usr/lib/jenkins/jenkins.war) was recently updated or a new version has been released (http://mirrors.jenkins.io/war/). A new version may stop things from working, and you may need to roll back.
