
What is container orchestration?


Container orchestration automates the deployment, management, scaling, and networking of containers. Enterprises that need to deploy and manage hundreds or thousands of Linux® containers and hosts can benefit from container orchestration. 

Container orchestration can be used in any environment where you use containers. It can help you to deploy the same application across different environments without needing to redesign it. And microservices in containers make it easier to orchestrate services, including storage, networking, and security. 

Containers give your microservice-based apps an ideal application deployment unit and self-contained execution environment. They make it possible to run multiple parts of an app independently in microservices, on the same hardware, with much greater control over individual pieces and life cycles.

Managing the lifecycle of containers with orchestration also supports DevOps teams who integrate it into CI/CD workflows. Along with application programming interfaces (APIs) and DevOps teams, containerized microservices are the foundation for cloud-native applications.

Use container orchestration to automate and manage tasks such as:

  • Provisioning and deployment
  • Configuration and scheduling 
  • Resource allocation
  • Container availability 
  • Scaling or removing containers to balance workloads across your infrastructure (see the sketch after this list)
  • Load balancing and traffic routing 
  • Monitoring container health
  • Configuring applications based on the container in which they will run
  • Keeping interactions between containers secure
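
To make a few of these tasks concrete, here is a minimal sketch of how they might be expressed declaratively in a Kubernetes Deployment manifest. The names and values (web, nginx:1.25, the CPU and memory figures) are illustrative assumptions, not recommendations:

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: web                    # hypothetical application name
    spec:
      replicas: 3                  # scaling: the desired number of identical containers
      selector:
        matchLabels:
          app: web
      template:
        metadata:
          labels:
            app: web
        spec:
          containers:
          - name: web
            image: nginx:1.25      # assumed example image
            resources:             # resource allocation for the scheduler
              requests:
                cpu: "250m"
                memory: "128Mi"
              limits:
                cpu: "500m"
                memory: "256Mi"
            livenessProbe:         # health monitoring: restart the container if this check fails
              httpGet:
                path: /
                port: 80
              periodSeconds: 10

You declare the state you want, and the orchestration tool continuously works to keep the running containers matched to that declaration.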

Container orchestration tools provide a framework for managing containers and microservices architecture at scale. There are many container orchestration tools that can be used for container lifecycle management. Some popular options are Kubernetes, Docker Swarm, and Apache Mesos.

Kubernetes is an open source container orchestration tool that was originally developed and designed by engineers at Google. Google donated the Kubernetes project to the newly formed Cloud Native Computing Foundation in 2015.

Kubernetes orchestration allows you to build application services that span multiple containers, schedule containers across a cluster, scale those containers, and manage their health over time.

Kubernetes eliminates many of the manual processes involved in deploying and scaling containerized applications. You can cluster together groups of physical or virtual hosts that run Linux containers, and Kubernetes gives you the platform to manage those clusters easily and efficiently. 

More broadly, it helps you fully implement and rely on a container-based infrastructure in production environments.

These clusters can span hosts across public, private, or hybrid clouds. For this reason, Kubernetes is an ideal platform for hosting cloud-native apps that require rapid scaling.

Kubernetes also assists with workload portability and load balancing by letting you move applications without redesigning them. 

Main components of Kubernetes:

  • Cluster: A control plane and one or more compute machines, or nodes.
  • Control plane: The collection of processes that control Kubernetes nodes. This is where all task assignments originate.
  • Kubelet: This service runs on each node, reads the container manifests, and ensures the defined containers are started and running.
  • Pod: A group of one or more containers deployed to a single node. All containers in a pod share an IP address, IPC, hostname, and other resources (see the sketch below).
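
As an illustrative sketch of the pod concept, the manifest below defines a single pod with two containers. Because they share the pod's network namespace, the containers can reach each other over localhost; the names and images are assumptions for the example:

    apiVersion: v1
    kind: Pod
    metadata:
      name: app-with-sidecar       # hypothetical pod name
    spec:
      containers:
      - name: app                  # main application container (assumed image)
        image: nginx:1.25
        ports:
        - containerPort: 80
      - name: log-forwarder        # sidecar that shares the pod's IP address and hostname
        image: busybox:1.36
        command: ["sh", "-c", "tail -f /dev/null"]   # placeholder command for the sketch

Both containers are scheduled to the same node and share the pod's IP address, which is what makes the pod the smallest deployable unit in Kubernetes.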

When you use a container orchestration tool, such as Kubernetes, you describe the configuration of an application in a YAML or JSON file. The configuration file tells the orchestration tool where to find the container images, how to establish a network, and where to store logs.
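
The networking piece, for example, is commonly declared as a Kubernetes Service, which tells the orchestrator how to expose and load-balance whatever containers match a label. This is a minimal sketch; the name, label, and ports are assumptions:

    apiVersion: v1
    kind: Service
    metadata:
      name: web                    # hypothetical service name
    spec:
      selector:
        app: web                   # route traffic to pods carrying this label
      ports:
      - protocol: TCP
        port: 80                   # port the service exposes inside the cluster
        targetPort: 80             # container port that receives the traffic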

When deploying a new container, the orchestration tool automatically schedules the deployment to a cluster and finds the right host, taking into account any defined requirements or restrictions. The orchestration tool then manages the container’s lifecycle based on the specifications defined in the configuration file.
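
Those requirements and restrictions can be declared in the same configuration file. The sketch below uses a node selector and resource requests so the scheduler only places the pod on a host that carries a matching label and has enough free capacity; the label and values are assumptions:

    apiVersion: v1
    kind: Pod
    metadata:
      name: web-pinned             # hypothetical pod name
    spec:
      nodeSelector:
        disktype: ssd              # assumed node label: schedule only onto matching hosts
      containers:
      - name: web
        image: nginx:1.25          # the image the tool pulls, as named in the configuration file
        resources:
          requests:                # the scheduler finds a host with at least this much free capacity
            cpu: "500m"
            memory: "256Mi"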

You can use Kubernetes patterns to manage the configuration, lifecycle, and scale of container-based applications and services. These repeatable patterns are the tools a Kubernetes developer needs to build complete systems. 

Container orchestration can be used in any environment that runs containers, including on-premises servers and public or private cloud environments.

Real production apps span multiple containers. Those containers must be deployed across multiple server hosts. That’s where Red Hat® comes in.

Red Hat OpenShift® is Kubernetes for the enterprise, and a lot more. OpenShift includes all of the extra pieces of technology that make Kubernetes powerful and viable for the enterprise, including registry, networking, telemetry, security, automation, and services.

And with tools like Red Hat Service Interconnect, routers and gateways provide trusted communication links between services running on different clouds, edge devices, generic Kubernetes clusters, and OpenShift.

With Red Hat, developers can make new containerized apps, host them, and deploy them in the cloud with the high availability, scalability, control, and orchestration that can turn a good idea into new business quickly and easily.

Try, buy, and manage certified software across public clouds, private clouds, and your datacenter. That’s the power of Red Hat Marketplace. It’s a simpler way to access the software you already rely on, build in a unified Kubernetes-based environment, and deploy anywhere.

Red Hat Marketplace means you’ll spend more time developing innovative solutions, not tracking down licenses, entitlements, and expirations.
