
5 reasons why you should develop a Linux container strategy

If you've shunned containers in the past, these five advantages will make you rethink containerization.

Containerization has gained popularity over the past few years, but the subject remains elusive for some. Opinions about this architectural paradigm range from "containers are just a way to obfuscate bad code" to "you're a dinosaur if your entire infrastructure isn't already containerized."

If you have one of these strong opinions, this post probably is not for you. The following is for those who have not had the time or bandwidth to research the topic and are not sure how the architecture might help their organization. In this article, I give a simple explanation of some of the advantages of adopting a container strategy within your infrastructure and offer some quick and simple tips to get started. There are, in fact, many advantages to containerization, both from a technical perspective and a development lifecycle perspective.

The following discussion outlines five advantages of the container approach, along with some disadvantages and challenges that tend to arise during the adoption of this new methodology.


The concept of containerization began in the late 1970s with the UNIX operating system and the chroot system call, which isolated a process and packaged the dependencies it needed within a single space. This architectural style brought many stability advantages to development. After many stages and permutations over the following decades, Docker emerged, marking the Golden Age of containerization. Docker solved most of the container runtime difficulties and offered an entire ecosystem for management, allowing containers to talk to one another and gain controlled access to system resources.

Server utilization

First, and most simply put, server utilization is a major advantage of containerizing your applications. If architected correctly, you can reclaim 20-30% more capacity per server with containerization. As you can imagine, this could be a game-changer for any organization looking to optimize its infrastructure. Unlike virtual machines, containers share the host's kernel rather than each booting a full guest operating system with its own dependencies, so they run far more efficiently.

[ You might also enjoy: Learn OpenShift with Minishift ]

Development cycle

One of the biggest reasons containers benefit any organization is the way they help developers engineer their apps. Containers have the application's entire runtime built into them, so developers don't need to worry about coding against the correct dependencies. This makes for a much better coding experience, one that becomes especially powerful with big teams of engineers: the development environment is the same, no matter the location, compute environment, or time.
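As a sketch of what "the runtime built in" means in practice, here is a minimal, hypothetical Dockerfile for a small Python service (the file names and base image tag are assumptions for illustration, not from the article):

```dockerfile
# Pin the interpreter version inside the image itself, so every
# developer and every environment gets the exact same runtime.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies from a pinned requirements file so builds
# are reproducible across laptops, CI runners, and production.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code last, so the dependency layer is cached
# and rebuilds after code changes are fast.
COPY app.py .

# The same command runs everywhere the image runs.
CMD ["python", "app.py"]
```

Anyone on the team can build this with `podman build -t myapp .` (or `docker build`) and get an identical environment, which is exactly what removes the "works on my machine" class of problems.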


Scalability

Containerization becomes extremely powerful once a proper orchestration tool, such as enterprise Kubernetes in Red Hat OpenShift, is in place. With OpenShift Operators, you can implement a serverless architecture that lets your organization absorb traffic spikes. Because containers spin up and down so quickly, they are well suited to serverless workloads: your infrastructure can grow as far as your compute allows and scale seamlessly back down to zero based on incoming HTTP requests.
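One way to express this scale-to-zero behavior is with a Knative Service, the engine behind OpenShift Serverless. The sketch below is hypothetical; the service name and image are assumptions for illustration:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello-serverless          # hypothetical service name
spec:
  template:
    metadata:
      annotations:
        # Scale down to zero pods when idle, up to 10 under load.
        autoscaling.knative.dev/min-scale: "0"
        autoscaling.knative.dev/max-scale: "10"
    spec:
      containers:
        - image: quay.io/example/hello:latest   # hypothetical image
          ports:
            - containerPort: 8080
```

Knative's autoscaler watches incoming requests and adds or removes pods to match, dropping the workload to zero replicas when traffic stops.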


Deployments

Instead of deployments being a total nightmare, they can actually be a huge benefit with the right containerization strategy. With the correct orchestration platform applied to your containers, you can control your deployment strategy with much finer granularity. For example, when you develop a patch or new feature, you can release it in stages based on a percentage of users, user demographics, or geography.
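On OpenShift, one way to stage a release by percentage of traffic is a weighted Route, splitting requests between a stable service and a canary carrying the new patch. The route and service names below are hypothetical:

```yaml
apiVersion: route.openshift.io/v1
kind: Route
metadata:
  name: myapp                    # hypothetical route name
spec:
  host: myapp.example.com
  # Send 90% of traffic to the stable release...
  to:
    kind: Service
    name: myapp-stable
    weight: 90
  # ...and 10% to the canary running the new version.
  alternateBackends:
    - kind: Service
      name: myapp-canary
      weight: 10
```

Adjusting the weights shifts traffic gradually, and setting the canary's weight back to zero rolls the release back without redeploying anything.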


Standardization

Standardizing on the right container platform is extremely important. Full standardization gives your apps and services maximum portability, from on-premises to the public cloud all the way to the edge. This is very powerful because your developers work in the same environment no matter where the application is deployed.

[ Getting started with containers? Check out this free course. Deploying containerized applications: A technical overview. ]

Wrap up

In this brief article, I described five reasons to adopt a container strategy in your enterprise, from better server utilization to standardized, portable development environments, and offered some quick and simple tips to get started. There are, in fact, many more advantages to containerization, both from a technical and a development lifecycle perspective.


Peter Gervase

I am a Senior Principal Security Architect at Verizon. Before that, I worked at Red Hat in various roles, including consulting and as a Solutions Architect specializing in Smart Management, Ansible, and OpenShift.


Greg Richardson

Greg has a passion for learning and is excited to dive into Red Hat offerings and open source. His background in sales, support, and engineering allows him to enthusiastically take on his new role as a Red Hat Solution Architect.
