Welcome to the world of containers
Organizations around the globe are creating more personalized customer experiences in order to retain and grow their customer base and revenues. They are staying closer to their customers’ needs through the adoption of containers and Kubernetes.
By turning to containers and Kubernetes, they are able to quickly deliver new applications, and migrate existing ones, to the cloud for more agility. Increased agility helps propel the innovation cycle whether it is rapidly building and deploying new applications or improving the customer experience (CX). For example, an airline carrier migrated their legacy system to a hybrid cloud environment using Red Hat OpenShift®, increasing code deployments from one per week to over 200 per day.
Enabling containers is more than just downloading Kubernetes. There is an ecosystem of solutions that when packaged together provide a platform for building, deploying, monitoring, and managing containerized workloads. When evaluating a platform to support innovation, security and automation are at the heart of the decision. Platforms need to be able to scale, heal, and constantly evolve.
In order to improve security, a platform needs to be architecturally designed around immutability to limit the potential attack vectors. This type of immutable architecture will also allow for simpler, seamless updates so operations teams can quickly respond to patches that address new vulnerabilities. Increased automation will support scalability and stability to help promote a more consistent and secure experience as applications multiply and grow to support the business’s digital initiatives.
With the right platform in place, it is time to transform ideas into reality. For example:
- Retailers can provide a seamless, personalized omnichannel/channel-less CX.
- Manufacturers can give their shop floor employees a safer and more productive work environment.
- Healthcare organizations can move beyond tracking medical devices and have prescriptive analytics running on their equipment to improve patient outcomes.
But until a platform can help turn your ideas into reality, you could be missing out on opportunities. Red Hat® OpenShift® delivers security-focused, scalable operations to IT teams and application developers for their on-premise, multicloud, or hybrid cloud deployment requirements.
What changes with containers?
There are some common approaches to modernizing the way applications are built, each of which can help ensure their success. From transforming a monolithic application to developing new cloud-native workloads, the evolution of container development and deployment has accelerated massive change. Integrating with DevOps tooling connects operations teams and developers so they can quickly deploy new applications. With the container model, each application can more easily be deployed on a number of infrastructure targets, from on-premise to multicloud or hybrid cloud.
One of the first steps towards digital transformation lies in the migration of monolithic applications to a cloud-ready, containerized architecture. In a monolith, all of the functions that comprise the application are written and tested as a single, large package. With a move to a containerized deployment, each individual function can be delivered in its own container and scaled independently. The migration is often done one function at a time, ensuring that each migrated function, as well as the remainder of the application, is running correctly before moving on.
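As a sketch of what one extracted function might look like once containerized, the following Kubernetes Deployment runs a hypothetical `order-lookup` service in its own container and scales it independently of the rest of the application (the service name, image, and replica count are illustrative assumptions, not a prescribed configuration):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: order-lookup              # one function extracted from the former monolith
spec:
  replicas: 3                     # scaled independently of the other functions
  selector:
    matchLabels:
      app: order-lookup
  template:
    metadata:
      labels:
        app: order-lookup
    spec:
      containers:
      - name: order-lookup
        image: registry.example.com/shop/order-lookup:1.0   # placeholder image
        ports:
        - containerPort: 8080
```

Because each function gets its own Deployment like this one, its replica count, resource limits, and release cadence can change without touching the rest of the application.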
The most prevalent use case for cloud-native application design is when delivering a brand new application. With this type of application, it is easier to begin the project with different assumptions that allow for the build and deployment model to center on containers. New application development efforts are likely building a microservices-based app and are looking to integrate with continuous integration and continuous delivery (CI/CD) build methodologies that allow for more frequent release cycles. The core of the development workflow would be an automated build process designed to rapidly test code iterations for quality. As code changes pass testing, they can be released as independent updates to the application in their own containers. Each individual containerized function can scale to address performance bottlenecks or reside on different infrastructures that support specialized services.
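One OpenShift-native way to wire source code into such an automated build is a BuildConfig with a webhook trigger, so every push produces a fresh, tested image (the repository URL, builder image, and secret name below are placeholders):

```yaml
apiVersion: build.openshift.io/v1
kind: BuildConfig
metadata:
  name: payments-service
spec:
  source:
    git:
      uri: https://git.example.com/shop/payments-service.git   # placeholder repo
  strategy:
    sourceStrategy:               # Source-to-Image: build an image from source
      from:
        kind: ImageStreamTag
        name: python:3.9
        namespace: openshift
  output:
    to:
      kind: ImageStreamTag
      name: payments-service:latest
  triggers:
  - type: GitHub                  # rebuild automatically on each push
    github:
      secret: webhook-secret
```

Each successful build updates the image stream tag, which can in turn trigger a rolling deployment of that one microservice.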
One of the fastest growing use cases for containerized applications is in the analytics space for artificial intelligence and machine learning (AI/ML). Whether your project is focused on business process automation, cognitive insights, or engagement, developing and deploying AI/ML solutions is becoming a key business application. Optimizing access to specialized hardware resources such as graphics processing units (GPUs), field-programmable gate arrays (FPGAs), and InfiniBand is critical for performance-sensitive AI/ML workloads. Building these solutions in a containerized environment adds flexibility and portability to better adapt the workloads for use. From training the model through real-time analysis, the ability to introduce AI/ML solutions has become a critical initiative for many organizations.
Customers have wide choice in Kubernetes solutions, including do-it-yourself (DIY) platforms built on upstream projects, managed services on public clouds, and other self-hosted platforms. Red Hat OpenShift stands out as a leading choice for customers who want a more secure, supported Kubernetes platform guided by deep expertise.
Why Red Hat OpenShift?
Regardless of how you are moving to containers, the right platform enables additional features beyond deploying and managing container environments. Red Hat OpenShift has been built to support the needs of growing container environments that require an enterprise-grade platform. Red Hat OpenShift adds value to Kubernetes with built-in authentication and authorization, secrets management, auditing, logging, and an integrated container registry for granular control over resources and user permissions.
TRUSTED ENTERPRISE KUBERNETES
The foundation for deploying containers and Kubernetes lies in the Linux® operating system. Red Hat is a community leader and builder in Kubernetes and container projects, building on our open source expertise to drive significant innovation in upstream projects. The architecture of Red Hat OpenShift 4 is based on the immutable Red Hat Enterprise Linux CoreOS, enabling a tighter level of integration and security for Kubernetes environments. Red Hat Enterprise Linux CoreOS is designed to be managed and run at a massive scale, with minimal operational overhead.
Start with a lightweight Linux operating system distribution that includes just the essentials: the operating system and basic userland utilities stripped to their bare minimum and shipped as an integral unit. By standardizing each installation of Red Hat OpenShift on Red Hat Enterprise Linux CoreOS, the foundation for an immutable platform is established. Consistency across platform instances is enhanced, and updates to the full stack can be managed from the operating system up through the application layer.
Red Hat OpenShift adds comprehensive, continuous security to upstream Kubernetes. The platform integrates tightly with Jenkins and other standard CI/CD tools for security-focused application builds. Users can further extend the security of applications on Red Hat OpenShift with independent software vendor (ISV) solutions validated through our expansive partner network.
Stateful applications built and deployed using containers also need storage that is easy to use, highly available, and flexible. Red Hat OpenShift Container Storage was created specifically for container-based environments and is highly integrated with Red Hat OpenShift Container Platform. OpenShift Container Storage enables application portability, allowing customers to make the most of their investment in containers and hybrid cloud technologies, encouraging faster development cycles for stateful applications and services.
With a growing hybrid multicloud ecosystem, monitoring and management becomes very complex. Without a consolidated view into the breadth of Kubernetes clusters across all infrastructures, projects can quickly spin out of control. Visibility into all OpenShift clusters is provided through cloud.openshift.com, giving a unified, hybrid cloud view of your containerized workloads.
A CLOUD-LIKE EXPERIENCE, EVERYWHERE
Time spent on routine platform and service maintenance across multiple footprints is time away from building critical services and customer experiences. Red Hat OpenShift automates life-cycle management for the container all the way down to Linux and helps decouple workloads from infrastructure—giving teams more time and freedom to innovate anywhere they choose to build and run.
Red Hat OpenShift 4 has been built to support Kubernetes Operators to streamline and automate installation, updates, and management of container-based services. Imagine deploying or updating a database, monitoring service, or build system across an entire cluster with the same ease as installing a smartphone application. Operators enhance the delivery of services to include configuration settings that enable consistent deployment of services across multiple instances.
Operating system updates and security patches can be regularly pushed to machines without requiring intervention by administrators. With this foundation, Red Hat OpenShift 4 automates the installation of the full stack—from the underlying infrastructure (Amazon Web Services, vSphere, Microsoft Azure, Red Hat OpenStack® Platform, etc.), to the Linux OS (Red Hat Enterprise Linux CoreOS), to the Kubernetes platform and integrated services (Red Hat OpenShift). When applications are distributed across hybrid multicloud environments, these automatic updates with Red Hat OpenShift dramatically improve security without causing service downtime. The result is full-stack continuous security from the operating system to the application, and throughout the software life cycle.
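The installer-provisioned flow is driven by a small declarative file. An abbreviated install-config.yaml for an AWS cluster might look like the following (the cluster name, domain, and region are placeholders, and several required fields such as the pull secret are elided):

```yaml
apiVersion: v1
baseDomain: example.com           # placeholder base DNS domain
metadata:
  name: demo-cluster              # placeholder cluster name
platform:
  aws:
    region: us-east-1
controlPlane:
  name: master
  replicas: 3
compute:
- name: worker
  replicas: 3
```

From this description, the installer provisions the infrastructure, lays down Red Hat Enterprise Linux CoreOS, and brings up the Kubernetes platform and its integrated services.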
Red Hat OpenShift 4 was designed with particular attention to improving the Kubernetes cluster installation experience. With the unique Red Hat OpenShift management model extending down to the operating system, an installation is treated simply as an upgrade: the initial installation is an update from nothing to the initial state. The installer bootstraps certain minimal default values for the cluster but relies on the available Operators from the initial configuration onward.
For many containerized applications, the initial installation is easy. However, when these applications need to be configured, updated, or backed up, specific operational knowledge and business logic are required to ensure these more complicated tasks are handled correctly. At scale, the knowledge required to manage these applications multiplies further, often requiring large amounts of IT coordination, from network permissions and systems allocation to backup, logging, and service updating.
To truly deliver consistent and simple consumption of these applications by developers, you need a way to package the business logic with the application in an automated and repeatable way. When delivering a new containerized application that is built to scale on demand, each discrete instance must be configured identically. All the business logic for configuration and operations needs to be included with the core service or application to deliver this consistently. Red Hat OpenShift combined with Kubernetes Operators provides this out of the box.
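With an Operator in place, consuming such a service reduces to declaring the desired state in a custom resource. The following sketch shows what a custom resource for a hypothetical database Operator could look like; the `kind`, `apiVersion`, and fields are illustrative assumptions, not any specific product's API:

```yaml
apiVersion: example.com/v1alpha1   # hypothetical Operator API group
kind: PostgresCluster
metadata:
  name: orders-db
spec:
  version: "13"
  replicas: 3                      # the Operator handles replication setup
  backup:
    schedule: "0 2 * * *"          # operational knowledge encoded once,
                                   # applied identically to every instance
```

The operational logic, such as how to configure replication or run a backup, lives in the Operator, so every instance created from a resource like this is configured identically.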
When services are deployed across a number of containers, each service needs to talk to each other. The challenge of managing the complex interactions between containers in highly distributed microservices environments should not be a burden placed on developers. A service mesh takes the logic governing the service-to-service communication between containers out of the individual services and abstracts it to a layer of infrastructure. To enable the service mesh, Red Hat OpenShift uses components from open source projects Istio, Jaeger, and Kiali. By integrating these together in a single package, the OpenShift Service Mesh delivers the interconnect, tracing, and visualization of the service mesh in action. This enables rapid troubleshooting of the service mesh and allows operations teams to quickly react to any changes in application delivery.
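For instance, shifting a small share of traffic to a new version of a service becomes a declarative rule in the mesh rather than code inside the application. The Istio-style VirtualService below is a sketch with illustrative names, and it assumes a companion DestinationRule defines the `v1` and `v2` subsets:

```yaml
apiVersion: networking.istio.io/v1beta1
kind: VirtualService
metadata:
  name: reviews
spec:
  hosts:
  - reviews
  http:
  - route:
    - destination:
        host: reviews
        subset: v1
      weight: 90                  # most traffic stays on the current version
    - destination:
        host: reviews
        subset: v2
      weight: 10                  # canary: 10% of traffic to the new version
```

Because the rule lives in the infrastructure layer, operations teams can adjust the split, or roll it back, without a code change or redeployment.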
The benefits of Red Hat OpenShift are available in multiple platform variants to match the consumption model our customers prefer. For teams that are prepared to manage Red Hat OpenShift themselves, OpenShift Container Platform can be installed on-premise or through the major cloud providers. Red Hat OpenShift Dedicated is a complete Red Hat OpenShift cluster provided as a cloud service and managed by Red Hat Operations. It is configured for high availability (HA) and dedicated to a single customer (single-tenant), backed by award-winning 24x7 Red Hat premium support. Microsoft Azure Red Hat OpenShift is a fully managed Red Hat OpenShift offering on Azure, jointly engineered, operated, and supported by Microsoft and Red Hat.
EMPOWERING DEVELOPERS TO INNOVATE
Red Hat pushes the boundaries of what containers and Kubernetes can do, driving innovation for stateful applications, virtual machines (VMs), functions, and machine learning on Kubernetes.
Using Operators to install and manage your containerized applications makes them easier to deploy, more reliable, and easier to keep up to date in the security patching arms race. With services backed by Operators, the development team focuses on challenges that deliver more value to your business and to your customers. The underlying service can be updated by the IT operations team to maintain system security across multiple applications without requiring each development team to stop work on their code.
Similarly, the service mesh takes issues such as service discovery, load balancing, fault tolerance, observability, and security out of the application. Each of these functions is provided by the service mesh, which IT operations teams define for use by the applications developers are creating. Taking these functions out of application-level libraries lets developers focus on the business logic in their code.
Developers are continually searching for alternative ways to execute their code. With cloud-native applications, being able to scale to zero has become a goal to limit operational costs of running their functions. Setting up a server that runs 24x7 to host code that is not in constant use means that the vast majority of the time customers are paying for processing power that is not being used. With a serverless instance, no resources are consumed unless your code is running.
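The cost argument can be made concrete with some back-of-the-envelope arithmetic. The prices and request rates below are assumptions for illustration, not real cloud pricing:

```python
# Illustrative cost comparison (all prices and rates are assumptions,
# not real cloud pricing): an always-on server vs. scale-to-zero serverless.

HOURS_PER_MONTH = 730
PRICE_PER_HOUR = 0.10           # assumed hourly cost of the compute capacity
REQUESTS_PER_MONTH = 100_000
SECONDS_PER_REQUEST = 0.2       # assumed execution time per invocation

# Always-on: pay for every hour, whether or not code is running.
always_on_cost = HOURS_PER_MONTH * PRICE_PER_HOUR

# Serverless: pay only for the hours the code is actually executing.
busy_hours = REQUESTS_PER_MONTH * SECONDS_PER_REQUEST / 3600
serverless_cost = busy_hours * PRICE_PER_HOUR

print(f"always-on:  ${always_on_cost:.2f}/month")
print(f"serverless: ${serverless_cost:.2f}/month")
print(f"idle share: {1 - busy_hours / HOURS_PER_MONTH:.1%}")
```

Under these assumed numbers the workload is busy for fewer than six hours a month, so the always-on server spends over 99% of its paid time idle, which is the gap that scale-to-zero eliminates.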
To support serverless use cases, Red Hat OpenShift makes use of Knative to establish a foundation to manage serverless workloads. By targeting the serverless framework to use Knative application programming interfaces (APIs), it is possible to bridge serverless and Kubernetes. When deploying a service, Knative will start and stop the service as needed. It can coexist with alternative architectures and is part of a general trend toward simplifying the developer experience and making developers more productive when creating flexible, scalable, and robust cloud-native applications.
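A minimal Knative Service illustrates the model; the service name and image are placeholders, and the annotation makes the default scale-to-zero behavior explicit:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: greeter
spec:
  template:
    metadata:
      annotations:
        autoscaling.knative.dev/minScale: "0"   # scale to zero when idle
    spec:
      containers:
      - image: registry.example.com/demo/greeter:latest   # placeholder image
```

Knative routes requests to the service, scales it up when traffic arrives, and removes the running pods when traffic stops, so no resources are consumed while the code is idle.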
Once the platform has been enabled to support hybrid cloud deployments, the development teams are able to focus on developing applications. By integrating directly with the existing workflow and toolchain solutions, developers can focus less on the platform and more on their code. The platform integrates tightly with Jenkins and other standard CI/CD tools, or Red Hat OpenShift’s built-in workflows and tools, for security-focused application builds.
Red Hat OpenShift also allows for the use of a developer-focused command-line interface (CLI) that abstracts the infrastructure management from code management. When building with the Red Hat OpenShift platform, you can inspire innovation and get applications and services to production sooner.
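A typical developer workflow with the `oc` CLI looks like the following sketch; the project name and repository URL are placeholders, and each command targets the application rather than the underlying infrastructure:

```shell
# Create a project (namespace) to work in
oc new-project demo

# Build and deploy an application directly from a source repository
oc new-app https://github.com/example/frontend.git --name=frontend

# Follow the automated build as it runs
oc logs -f buildconfig/frontend

# Publish a route so the application is reachable
oc expose service/frontend
```

At no point does the developer provision machines, configure load balancers, or touch the operating system; the platform handles the infrastructure behind each command.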
Preparing for success
As a platform for innovation, Red Hat OpenShift includes everything you need to run containers and Kubernetes consistently across any on-premise, private, or public cloud infrastructure. Power business transformation and unite your teams on a cost-effective, single platform to quickly deliver the exceptional experiences your customers expect, anywhere they are. Reduce total cost of ownership for all your applications and supporting infrastructure, with service and application portability across on-premise and cloud environments. With Red Hat OpenShift, your most innovative people can focus on what matters and continually outpace customer expectations and deliver the big ideas that change everything.