Automating Jenkins Jobs using Build Pipeline inside Docker Container with the help of Dockerfile

In this blog, I will automate Jenkins inside a Docker container with the help of a Dockerfile, and then create and execute different Jenkins jobs running inside that container.

⚡Task Description⚡

1. Create a container image that has Jenkins installed, using a Dockerfile.

2. When we launch this image, it should automatically start the Jenkins service in the container.

3. Create a job chain of job1, job2, job3 and job4 using the Build Pipeline plugin in Jenkins.

4. Job1 : Pull the GitHub repo automatically when a developer pushes to GitHub.

5. Job2 : By looking at the code or program file, Jenkins should automatically start a container from the image that has the respective language interpreter installed, to deploy the code ( e.g. if the code is PHP, then Jenkins should start a container that already has PHP installed ).

6. Job3 : Test whether the website is working or not.

7. Job4 : If the website is not working, send an email to the developer with the error messages.

8. Create one extra job, job5, for monitoring: if the container where the app is running fails for any reason, this job should automatically start the container again.

Here we go ⏩

Firstly, we create the Dockerfile.

👉 First create a separate workspace for the Dockerfile. I have created a directory named jenkinsws. In this workspace, create a Dockerfile.

Dockerfile
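The screenshot above shows the Dockerfile. A minimal sketch along these lines (the base image, package names and the jenkins.war path are assumptions for a CentOS-style setup, not the exact file from the screenshot):

```dockerfile
FROM centos:7

# Jenkins needs Java; git is needed for the SCM jobs, sudo for the copy steps
RUN yum install -y java-11-openjdk git sudo wget

# Add the official Jenkins yum repository and install Jenkins
RUN wget -O /etc/yum.repos.d/jenkins.repo \
        https://pkg.jenkins.io/redhat-stable/jenkins.repo && \
    rpm --import https://pkg.jenkins.io/redhat-stable/jenkins.io.key && \
    yum install -y jenkins

EXPOSE 8080

# Task 2: start the Jenkins service automatically when the container launches
# (the war path is where the RPM package typically places it)
CMD ["java", "-jar", "/usr/lib/jenkins/jenkins.war"]
```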

Build the image from the Dockerfile and verify it:

docker build -t <image_name>:<tag> /workspace
docker images

So let’s launch the Docker container.

Launching the container automatically starts the Jenkins service.
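The launch command looks something like this (the image tag, container name and port mapping are assumptions):

```shell
# Run the Jenkins image built above in detached mode.
# 8080 is Jenkins' default web port; name and tag are assumptions.
docker run -dit --name jenkins_os -p 8080:8080 jenkins:v1
```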

Here we get the initial Jenkins password, which is needed to unlock Jenkins for the first time.

Now install the suggested plugins. As we can see, the plugins start downloading…👇

👉 Here we create a username and password for logging in to Jenkins.

Bravo!! Our Jenkins setup is ready to use.. ✌

First, create a new GitHub repo so that we can put its URL in the Jenkins job..

🔰Create Job 1

This job pulls the GitHub repo automatically whenever a developer pushes to it.

👉 First, give your GitHub repository URL in the Source Code Management section.

👉 Now, in the Build Triggers section, select “GitHub hook trigger for GITScm polling”, so that the Jenkins job starts automatically only when the code on GitHub changes.

👉 Next, this is the code that copies our files from the Jenkins workspace to our desired folder.
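A sketch of that build step; both paths are assumptions (Jenkins exports $WORKSPACE itself, and the target below stands in for the web server’s document root):

```shell
# Job1 "Execute shell" build step: copy the freshly pulled repo from the
# Jenkins workspace into the folder the web server serves.
WORKSPACE="${WORKSPACE:-.}"        # Jenkins sets this automatically
TARGET="${TARGET:-/tmp/webroot}"   # would be e.g. /var/www/html on the server
mkdir -p "$TARGET"
cp -rf "$WORKSPACE"/. "$TARGET"/
echo "copied files from $WORKSPACE to $TARGET"
```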

Console output of Job1

Our job1 ran successfully.

🔰Create Job 2

By looking at the code or program file, Jenkins should automatically start a container from the image that has the respective language interpreter installed, to deploy the code ( e.g. if the code is PHP, then Jenkins should start a container that already has PHP installed ).

👉 This job triggers after job1 builds, so in Build Triggers select “Build after other projects are built” and in “Projects to watch”, put job1.
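The dispatch logic can be sketched like this; the helper name is hypothetical, and the official php:apache and httpd images stand in for whatever images the job actually uses:

```shell
# Job2: pick a container image based on the pushed file's extension.
# detect_image is a hypothetical helper for illustration.
detect_image() {
  case "$1" in
    *.php)  echo "php:apache" ;;   # official image with PHP + Apache
    *.html) echo "httpd" ;;        # plain Apache is enough for static HTML
    *)      echo "unknown" ;;
  esac
}

detect_image "index.php"   # prints php:apache
# Job2 would then launch the matching container, e.g.:
#   docker run -dit -p 8081:80 --name webserver "$(detect_image index.php)"
```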

Console output of job2

So, job2 completed successfully.

⏩ Create Job 3

Job3 checks whether our website is running or not.

This job triggers after job2 builds, so in Build Triggers select “Build after other projects are built” and in “Projects to watch”, put job2.

👉 Now the script below checks whether our website returns a 200 status code. Status code 200 means the website is working fine.
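A sketch of that check; the URL and port in the comment are assumptions:

```shell
# Job3: fetch only the HTTP status code and fail the build if it isn't 200.
site_status() {
  # -s silent, -o discard body, -w print just the status code
  curl -s -o /dev/null -w "%{http_code}" "$1"
}

check() {
  if [ "$1" -eq 200 ]; then
    echo "website is working fine"
  else
    echo "website is down (got status $1)"
    return 1   # a non-zero exit marks the Jenkins build as failed
  fi
}

check 200   # prints: website is working fine
# In the real job: check "$(site_status http://localhost:8081/index.php)"
```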

Console output of job3

So, job3 completed successfully.

⏩ Create Job 4

If the website is not working, send an email to the developer with the error messages.

👉 This job triggers after job3 builds, so in Build Triggers select “Build after other projects are built” and in “Projects to watch”, put job3.

Now this script first checks whether our website is returning a status code other than 200; if so, it sends an email notification to the developer.
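The idea can be sketched as follows; the address and the use of mailx are assumptions (many setups call a small Python smtplib script here instead):

```shell
# Job4: runs after Job3; if the site did not answer with 200, notify the
# developer. The recipient address and mail command are assumptions.
notify() {
  if [ "$1" -ne 200 ]; then
    echo "site returned $1 - mailing the developer"
    # mailx -s "Website down (status $1)" developer@example.com < error.log
  else
    echo "site is healthy, no mail sent"
  fi
}

notify 404   # prints: site returned 404 - mailing the developer
```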

Console output of job4

So, our job4 completed successfully.

⏩ Create Job 5

This job monitors the Docker container. If the container fails at any time, this job launches it again.

👉 In the “Build periodically” trigger, give “* * * * *”. This means Jenkins runs the job every minute, every hour, throughout the year, checking the Docker container. If the container has failed, the job launches it again.
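The monitor step can be sketched like this; the container name and fallback image are assumptions, and it requires the Docker CLI to be available to the Jenkins user:

```shell
# Job5: runs every minute ("* * * * *"); restart the app container
# if it is not running. Name and image are assumptions.
name="webserver"
if ! docker ps --format '{{.Names}}' | grep -qx "$name"; then
  echo "$name is down - restarting"
  docker start "$name" 2>/dev/null || \
    docker run -dit -p 8081:80 --name "$name" php:apache
else
  echo "$name is running"
fi
```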

Console output of job5

So, our job5 completed successfully.

All 5 jobs ran successfully.

Finally, all the jobs are complete; now let’s create the build pipeline.

⏩ Create a Build Pipeline

👉 To make a build pipeline, first install the Build Pipeline plugin.

👉 Then click on “New view” to add a pipeline.

👉 Give it a name, enable the Build Pipeline View option, and select job1 as the initial job.

Now let’s see our pipeline :

👉 Now go to the browser, enter the web address of our server, and you will see the page working fine.

Website is working fine..

👉 Now let’s check by making a mistake in the PHP code. So now I push the wrong code.

wrong PHP code

👉 Now you will see that job3 fails automatically.

👉 Now go to the page and you will see that it’s not working.

Page not loaded

Hurray!! We successfully completed all the given tasks. 😊✌

Thanks For Reading…💙

📌Here is my GitHub Repo Link:-


Engineering Student || Aspiring DevOps Engineer || Tech Enthusiast

Krishna Pal
