Building your CI/CD pipeline on AWS

Dinuka Arseculeratne
Nov 15, 2018 · 9 min read


Having worked on building pipelines using tools such as Bamboo, Jenkins, Circle CI et al., I wanted to explore how things come together in a cloud environment. AWS provides three tools for building your pipelines. The tools provided are (drum roll please):

  • AWS CodeBuild
  • AWS CodeDeploy
  • AWS CodePipeline

Hmmm, they sure did not spend much time on the names for the products there, but hey, it works so it's all good. In this post, we will be building out a pipeline with a mini GitHub project I have set up on my repository, which you can fork if needed. It is a simple Spring Boot application that exposes a REST API. Finally, we deploy it to an EC2 instance and test it out with Postman to see if everything is working as expected. I love seeing the bigger picture before we start out, so here it is;

Building the code pipeline on AWS

So first of all, my GitHub repository (yea, not that exciting) can be found here. I have always been a big DC Comics fan since my childhood, so I do tend to incorporate them in every possible situation I can. My wife even bought me a Batman pajama suit recently (she still loves Marvel more though…). Getting back on track, I will not be going into the details of the code found on GitHub since it is quite trivial: a simple REST API to get the Justice League (not even the full team).

Before we begin, make sure to bring up an AWS EC2 instance with the following configured;

  • A role with access to S3 (this is needed because the AWS CodeDeploy agent fetches the deployment artefacts from S3)
  • The CodeDeploy agent installed as mentioned here (a sketch of the install commands follows below)
  • A tag with a key/value of your choice (we will need it later on during the AWS CodeDeploy stage)
  • A security group with the following;

Yes, I have opened everything up to the whole world. Please do not send me death threats; I just wanted to maintain brevity :)

The Custom TCP rule is there so that we can test things out via Postman at the end, as we need to open up the ephemeral ports for the traffic to go out. More about ephemeral ports can be found here.
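For reference, installing the agent on an Amazon Linux instance boils down to something like this (the region in the S3 URL, the instance id and the tag key/value are placeholders; swap in your own):

#!/bin/bash
# Install the CodeDeploy agent on Amazon Linux (region in the URL is a placeholder)
sudo yum update -y
sudo yum install -y ruby wget
cd /home/ec2-user
wget https://aws-codedeploy-us-east-1.s3.amazonaws.com/latest/install
chmod +x ./install
sudo ./install auto

# Tag the instance so CodeDeploy can find it later
# (instance id and key/value are placeholders)
aws ec2 create-tags --resources i-0123456789abcdef0 --tags Key=Name,Value=justiceleague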

Now that we have the code, the next thing to do is to go set up AWS CodeBuild. In this article we will focus on how to do it via the AWS Console, and in a later post we will automate everything (because we like automation so much) using CloudFormation. Before we get started, we need to set up the S3 bucket, since everything is copied to S3 during the lifecycle of the pipeline in AWS. Done creating the S3 bucket? Good, let's get going. If you look at the source code in the GitHub repository, you will see a file named buildspec.yml. This file specifies the steps to be taken as part of the build process. Let us have a look at this file;

version: 0.2

env:
  variables:
    JAVA_HOME: "/usr/lib/jvm/java-8-openjdk-amd64"

phases:
  install:
    commands:
      - echo installing maven...
      - apt-get update -y
      - apt-get install -y maven
  build:
    commands:
      - echo building the justiceleague-tracker
      - mvn install

artifacts:
  files:
    - target/*.jar
    - scripts/*.sh
    - appspec.yml
  discard-paths: yes

cache:
  paths:
    - '/root/.m2/**/*'

As AWS CodeBuild spawns the build nodes upon request, you will need to tell it to install the packages that you need. CodeBuild also gives you the ability to use your own Docker image from AWS ECR or DockerHub if you want, in which case you would pre-install the packages you need. For this example, I have gone ahead with a build image managed by AWS CodeBuild.

The other interesting section in the buildspec is the cache. As you know, with Maven, dependencies are downloaded during the build if they are not present within your local repository. Unless you have something like Nexus or Artifactory that caches your Maven dependencies, each build will try to download them from the Maven repository. With the cache functionality, AWS caches whatever directory you mention, uploading it to S3 as part of the build process. Whenever a new build is spawned, AWS re-uses the cached content, minimising the time taken for the overall build process. In this instance, I am caching the Maven repository path.

We start off with naming our build project;

After this, let's move on to specifying our source provider, which in this case is the GitHub repository. I have linked my repository here as you can see;

Next up, we specify the image we need to run our build. In this case, I have selected an AWS CodeBuild managed image. Do note that a new service role will be created as part of this process. This role will grant access to S3 and CloudWatch;

Moving on, the next section is about the buildspec file we talked about before. If you have not used the default name or if the file lives at a different path, this is where you can customise it;

Lastly, you will configure how you want the artefact packaged. In this instance, I have configured it to be copied to S3 as a zip, because AWS CodeDeploy needs it as a zip for its deployment process. You can also specify the S3 location for the cached content we talked about before.
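If you prefer the CLI to clicking through the console, the whole build project above can be sketched in a single command. The project name, repository URL, bucket and role ARN below are placeholders for whatever you have chosen:

# Create the CodeBuild project; all names and ARNs are placeholders
aws codebuild create-project \
  --name justiceleague-tracker-build \
  --source type=GITHUB,location=https://github.com/your-user/justiceleague-tracker.git \
  --artifacts type=S3,location=my-pipeline-bucket,packaging=ZIP \
  --environment type=LINUX_CONTAINER,image=aws/codebuild/java:openjdk-8,computeType=BUILD_GENERAL1_SMALL \
  --cache type=S3,location=my-pipeline-bucket/cache \
  --service-role arn:aws:iam::123456789012:role/CodeBuildServiceRole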

That is about it for setting up your build project. You should now see it in all its glory in your AWS CodeBuild console. We now move on to setting up AWS CodeDeploy for our deployment process. Before we do this, we have to set up a new IAM role for CodeDeploy. This role will need the AWS CodeDeploy policy as well as read/write access to S3. Mine looks like this;
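If you would rather script that role, the gist of it is a role that codedeploy.amazonaws.com can assume, with the managed CodeDeploy policy and S3 access attached. The role name and the trust policy file below are placeholders of my own:

# Create the CodeDeploy service role; trust.json must allow
# codedeploy.amazonaws.com to assume the role
aws iam create-role \
  --role-name CodeDeployServiceRole \
  --assume-role-policy-document file://trust.json

# Attach the AWS managed CodeDeploy policy plus S3 read/write access
aws iam attach-role-policy --role-name CodeDeployServiceRole \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSCodeDeployRole
aws iam attach-role-policy --role-name CodeDeployServiceRole \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess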

Before we begin, let's go back to our GitHub repository, where you will find a file called appspec.yml, the file that specifies the steps to be taken as part of the installation phase of CodeDeploy;

version: 0.0
os: linux
files:
  - source: /
    destination: /home/ec2-user
hooks:
  AfterInstall:
    - location: fix_previleges.sh
      timeout: 300
      runas: root
  ApplicationStart:
    - location: start_server.sh
      timeout: 300
      runas: root
  ApplicationStop:
    - location: stop_server.sh
      timeout: 300
      runas: root

In our case, we first copy the deployment artefacts to the home directory, after which we set the correct privileges so that we can execute our script files. We then have a start and a stop script defined. The start script is defined as follows;

#!/bin/bash
# Start the app in the background with stdin/stdout/stderr detached,
# so the CodeDeploy hook can finish
java -jar -Dspring.profiles.active=dev /home/ec2-user/justiceleague-tracker-0.0.1-SNAPSHOT.jar > /dev/null 2> /dev/null < /dev/null &

When I first started off, I only had it defined as follows;

#!/bin/bash
java -jar -Dspring.profiles.active=dev /home/ec2-user/justiceleague-tracker-0.0.1-SNAPSHOT.jar &

But this ended up hanging the overall deployment process. Then I stumbled upon this page, which mentioned that long-running processes need their input and output redirected, which is why I had to amend it with;

> /dev/null 2> /dev/null < /dev/null
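For completeness, the stop script only needs to kill the running jar. A minimal sketch along these lines would do (the actual stop_server.sh in the repository may differ):

#!/bin/bash
# Kill any running instance of the tracker jar; pkill exits non-zero
# when nothing matches, so we swallow that to keep the hook happy
pkill -f 'justiceleague-tracker-0.0.1-SNAPSHOT.jar' || true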

The first step is to set up a CodeDeploy application;

After this, we need to set up a deployment group. This is where you tell CodeDeploy which instances you need to deploy your application to. In our case, I will be using the tag I created for my EC2 instance to create the deployment group. You can, however, even specify an Auto Scaling group if you wanted to;

First you give a name for the deployment group;

Follow that up with assigning the AWS role you created before for CodeDeploy;

Next you select the deployment type. To keep this simple, I have kept it as In-place but as you can see we can even do blue/green deployments if needed;

In the next section you mention how CodeDeploy will find your instances. In my case, I select the tag I created before to identify my EC2 instance;

Finally, you provide the deployment settings you need; in my case I have selected all at once. If you wanted to do a rolling update, you could do that too. As I am not using an ELB in this example, I have not ticked the load balancer option;
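If you prefer the CLI here as well, the application and deployment group can be sketched like this (the names, the tag key/value and the role ARN are placeholders):

# Create the CodeDeploy application (the name is illustrative)
aws deploy create-application --application-name justiceleague-tracker

# Create a deployment group that finds instances by the EC2 tag we created earlier
aws deploy create-deployment-group \
  --application-name justiceleague-tracker \
  --deployment-group-name justiceleague-dg \
  --service-role-arn arn:aws:iam::123456789012:role/CodeDeployServiceRole \
  --ec2-tag-filters Key=Name,Value=justiceleague,Type=KEY_AND_VALUE \
  --deployment-config-name CodeDeployDefault.AllAtOnce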

Now that we are done with that, we finally move on to creating our pipeline in all its glory, which will bring everything together to build and deploy our application to our lonely EC2 instance;

We start off by giving the pipeline a name and assigning an IAM role. In my case, I already had a role defined, which I reuse. You can let CodePipeline create a bucket for you, or you could use a bucket you already have, which is what I have done in this instance;

The next step is to link our GitHub repository as follows. Note that I have granted the privilege to create a webhook so that CodePipeline does not have to keep polling for new changes;

We then select the AWS CodeBuild project we set up as our build provider. AWS CodePipeline is very flexible here in that you could even hook in your own build server, such as Jenkins, if needed;

Time then to hook up the CodeDeploy application we set up before as follows;

That is it: you finally review and create your pipeline, after which your build is triggered. Here is my final, successfully run pipeline in all its glory;
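As a side note, you can keep an eye on the pipeline from the terminal as well; the pipeline name below is whatever you named yours:

# Show the current state of each stage in the pipeline
aws codepipeline get-pipeline-state --name justiceleague-pipeline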

Finally, we test our glorious Justice League API with Postman;
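If you prefer the terminal over Postman, a quick curl against the instance does the same job. The public DNS, port and path below are illustrative; check the controller in the repository for the actual endpoint:

# Hit the API on the EC2 instance; host, port and path are placeholders
curl -s http://ec2-xx-xx-xx-xx.compute-1.amazonaws.com:8080/justiceleague/members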

There we have it. Our Justice League tracker, integrated into our CI/CD pipeline on AWS. A proud moment indeed. Go have a drink now. You deserve it.

In my opinion, AWS does make it easier to set up a pipeline with a few clicks and configurations. Having worked on setting up pipelines with Bamboo, I must say that Bamboo does provide a better way of building and maintaining your pipelines, with features such as linked repositories and built-in plugins for other integrations as needed. But the AWS tools sure do play nice with the rest of the AWS infrastructure, where you can integrate with CloudWatch for alarms and monitor everything centrally, which I believe adds more value. Another plus point with the AWS tools is the flexibility of spawning build agents when needed, rather than having servers running even when they are not needed. It also gives you the capability to add your Jenkins agents as build agents if need be.

Overall, the integration was seamless, and the out-of-the-box features provided (blue/green deployments, auto-scaling, etc.) save a lot of time in the overall process.

Have a good day ahead everyone, and thank you for reading until the end (if you did make it).
