Abhijit Hota

Extremely fallible

A sorry excuse for a blog where I write about development, life and opinions I have

Containerisation for DevOps

  • #devops
  • #containers

Background

In October 2022, a certain company reached out to me to write an article about DevOps and containerization. I never completed it in time, and when I finally did, they didn’t respond.

I did a good amount of research for this article, considering I did not know much about the space. Now that I work for a DevOps shop, I thought it was only fair to publish it.

I don’t think the writing is the best, but it’s better to publish it than to let it rot in my drafts.

You can skip directly to the section about the software development lifecycle if you’re already familiar with DevOps. That said, I think the immediately following sections act as a good primer.

DevOps

Despite being a simple amalgamation of the words development and operations, DevOps as a concept is widely misunderstood, and it isn’t well defined in the literature. The common consensus, however, is that it’s about a team culture that maximises the ease and quality of releases and development-to-deployment speed by leveraging collaboration, tooling and other methodologies.

According to GitLab, DevOps is

people working together to conceive, build and deliver secure software at top speed. [it] enables software developers (devs) and operations (ops) teams to accelerate delivery through automation, collaboration, fast feedback, and iterative improvement.

DevOps itself is never concerned with the actual business logic of the application. Like a management methodology, it is about making the development and operations teams more integrated, making the software supply chain more efficient in the process.

Characteristics of DevOps

The meaning, and hence the rules, of DevOps change from org to org, but there are certain properties that any DevOps movement must align on:

What is Containerization?

Containerization, in the simplest of terms, means isolation. It is one way of doing OS-level virtualization, which basically means running a program, be it a simple application or a whole other OS, on top of your current OS.

A lot of containerization software exists in the industry today. However, the one that completely changed the landscape by providing an easier UI is Docker. For the rest of the article, when we say containers, we mean Docker containers.

The problems faced in DevOps

The development of almost all software can be boiled down to eight steps: plan, build, test, package, secure, release, deploy and monitor. This is called the software development lifecycle. A DevOps-driven team needs to care about the efficiency of all these steps, and sure enough, teams face issues in every one of them. We’re going to look at some of the common problems faced in these steps.

Why Containerization?

The problems described above have a few demands in common: faster development, immutable builds and security via isolation. Containerization addresses all of these in one way or another.

Reproducible environments

Docker follows a declarative, text-file-based approach for creating images via its configuration file, the Dockerfile. It is a one-stop shop for all your configuration and tool versions, and you can manage this file just like the rest of your code.

What is declarative? Instead of writing code that says “Start this app on port 8080”, you say “port = 8080” and start the container.
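As a sketch, a minimal Dockerfile for a hypothetical Node.js app could look like this (the image, file and port names are examples, not from any real project):

```dockerfile
# Each line declares what the image should contain,
# rather than scripting how to set a machine up at runtime.
FROM node:20-alpine        # pinned base image and runtime version
WORKDIR /app
COPY package*.json ./
RUN npm ci                 # install exact, lockfile-pinned dependencies
COPY . .
ENV PORT=8080              # "port = 8080", declared as data, not startup code
EXPOSE 8080
CMD ["node", "server.js"]  # the single process this container runs
```

Building it with `docker build -t myapp .` produces the same image from the same file, wherever it runs.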

This makes containers remarkably easy to use for abstracting and composing different environments. Want a staging environment? No problem. Change a few variables and spin up another container without worrying about what will happen to production.
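One way to sketch this, assuming a Docker Compose setup with hypothetical service and variable names, is to parameterise a single definition per environment:

```yaml
# docker-compose.yml (service name, image tag and variables are illustrative)
services:
  web:
    image: myapp:1.4.2
    environment:
      APP_ENV: ${APP_ENV:-production}   # overridden per environment
      DATABASE_URL: ${DATABASE_URL}     # staging points at a staging DB
    ports:
      - "${HOST_PORT:-8080}:8080"
```

Running `APP_ENV=staging HOST_PORT=8081 docker compose -p staging up -d` would then bring up a separate staging stack alongside production, assuming the variables are set appropriately.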

Versatile builds

In the previous section, we saw the problems that come with packaging and security. Docker images are meant to be read-only and immutable. This works great for security, as we don’t want our builds to be tampered with. After a release image is built, it is pushed to a registry like GitHub Container Registry or Docker Hub.

Creating a Docker image also means that you don’t have to worry about configuration drift or dependency mismatches. Everything a particular application needs is bundled into the image, giving consistent builds. Not to mention, since the Dockerfile is basically just code, it can be version controlled via an SCM like Git.

The benefit of images shows not only when releasing but also when rolling back the inevitable bugs that get into the application. If your image releases are versioned, which they are in most cases, all you have to do is pull a previous version from your registry, spin it up and take the current one down.
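Assuming versioned tags, and with a hypothetical registry, app name and tag, a rollback is little more than:

```shell
# Pull the last known-good release (the 1.4.1 tag is illustrative)
docker pull registry.example.com/myapp:1.4.1
# Take the current one down...
docker rm -f myapp
# ...and spin the previous version up in its place
docker run -d --name myapp -p 8080:8080 registry.example.com/myapp:1.4.1
```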

Consistent development tools

You can not only ship your entire production application wherever and however you want, but also speed up development by including tooling inside your containers. New developer joining your team? New container for them.

Tools like VS Code Dev Containers make this process super easy. You can have per-project development containers, and all a new developer has to do is pull and start the container.
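A minimal `.devcontainer/devcontainer.json` sketch can be enough for this; the image tag and extension ID below are examples, not recommendations:

```json
{
  "name": "myapp-dev",
  "image": "mcr.microsoft.com/devcontainers/typescript-node:20",
  "postCreateCommand": "npm install",
  "customizations": {
    "vscode": {
      "extensions": ["dbaeumer.vscode-eslint"]
    }
  }
}
```

With this checked into the repo, every developer gets the same runtime, dependencies and editor tooling on day one.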

Testing via Isolation

We discussed the importance of testing and its problems. How do we ensure that tests run against scenarios as rigorous as possible without harming our development or CI systems? Run them inside containers!

Libraries like ory/dockertest provide a first-class experience for spinning up entire databases inside containers and testing your code against them!
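Even with plain Docker, the idea can be sketched as a throwaway database per test run; the image, port and password here are placeholders:

```shell
# Start a disposable Postgres just for this test run
docker run -d --name test-db -e POSTGRES_PASSWORD=secret -p 5433:5432 postgres:16
# ...run the test suite against localhost:5433...
# Tear it down; nothing leaks into the next run
docker rm -f test-db
```

Tools like ory/dockertest automate exactly this loop from inside your test code, including waiting for the database to become ready before tests run.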

Security via Isolation

As we discussed in the earlier paragraphs, when we use containers, we package and ship applications with their dependencies rather than relying on the underlying operating system. This shrinks the application’s attack surface and protects against exploits in dependencies installed on the host OS.

With proper auditing, or at least a smoke-test verification of all the dependencies that go into our containers, we can protect our apps against a lot of threats.

Another thing to note is that each container runs in its own isolated environment. This isolation prevents containers from interfering with each other, which reduces the chances of security breaches or accidental leaks. You can start a billion containers and be sure that if one fails, the others will remain intact, or at worst, fail for unrelated reasons.

Resource conscious

Containers are much lighter than their heavyweight virtualization counterparts: virtual machines.

In traditional virtualization, a piece of software called a hypervisor emulates hardware to run virtual machines. Containerization, on the other hand, just runs another process on top of your base OS using kernel mechanisms like chroot, namespaces and cgroups.

The benefit of ditching the hypervisor can be seen in real-life numbers. For example, Google was starting around 2 billion containers per week, and that was back in 2014! That gives us an idea of how lightweight containers actually are.

Containers also let us limit resource usage, like memory or CPU, with simple configuration.
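For instance, a hypothetical container can be capped at half a CPU core and 512 MiB of memory with two flags:

```shell
# --memory and --cpus are standard docker run flags; the image name is a placeholder
docker run -d --name myapp --memory=512m --cpus=0.5 myapp:latest
```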

Enables Collaboration

One last point that might be a bit of a stretch, but can definitely become reality: containers can be run by anyone. The designer on your team doesn’t have to learn about npm or Webpack to look at the application. They just need to install Docker on their machine, download the code, and start the container.

Albeit not a better solution than staging environments or tunnels, teams actually use this sometimes!

Conclusion

Containerization is often cited as one of the key technologies enabling DevOps culture. Containers allow for much more rapid and consistent deployment of applications, and they make it far easier to manage dependencies and isolate applications from each other. This makes it possible to deploy much more frequently, and with much less risk.

When we go back to the characteristics of DevOps and see how containers solve the problems posed in the software development lifecycle, we can see it all coming together.

Containerization makes workflow automation better by making CI more efficient through isolated testing. It makes packaging and releasing a breeze with images and containers. It enables iterative development and quick feedback by letting anyone reproduce environments suited to the scenario being tested.

In the end, it’s safe to say that containerization is a capable catalyst for incorporating DevOps methodologies into a team. It is no silver bullet, but it makes complex application deployments quick and reliable.