When industry influencers and CIOs talk about the future of computing, they typically aren’t only discussing hardware advancements or cloud-based software. Increasingly, these conversations center on transformation through application innovation: providing customers with new predictive services driven by an integrated user experience. This could mean inspecting customer data patterns to promote new banking services, analyzing health indicators to proactively recommend treatment, or delivering an immersive interface for personalized interactions.
Whatever the end product, it’s about gaining a competitive advantage in an ever-evolving, highly competitive marketplace through technological advancement. Enter containers. Containers enable these applications to evolve faster, increase developer velocity and bring greater portability and consistency regardless of the underlying infrastructure.
Gartner predicts that by 2022, more than 75% of global organizations will be running containerized applications in production, a significant increase from fewer than 30% in 2019.
This trajectory makes Kubernetes, the de facto standard for running containerized, cloud-native applications at scale, a critical part of the modern enterprise IT mix. CIOs and IT decision-makers recognize the importance of Kubernetes in enabling developer productivity and speeding business innovation.
The future of IT is going to be about greater interactivity, seamless integrated experiences, predictive analytics, automation, decision making via machine learning, making sense of data exhaust, augmented and virtual reality, and a host of other applications we cannot even imagine yet. These applications will run most effectively when they are given the greatest flexibility and agility. Container-based, cloud-native apps orchestrated by Kubernetes offer those attributes, making them the building blocks of modern IT infrastructure. The future of IT requires a platform that supports all of this, spans existing IT investments in data centers and clouds, and embraces what is yet to come. This is why Red Hat champions an open hybrid cloud approach.
For a long time, the hybrid environments we saw customers using included bare-metal physical servers, virtual machines and private and public clouds (sometimes even multiple public clouds to meet specific needs). Customers want their hybrid environments to come together, or, in other words, for the specific IT environment to be immaterial. They want their hybrid environments to be used as one, secured as one, managed as one and to interact as one. In short, they want a consistent and solid foundation, and consistent ways to build and manage apps regardless of which footprint those apps are on.
Enabling hybrid cloud application deployments is a key part of Kubernetes’ future, and our focus has been on putting together the various pieces that enable us to deliver the best possible platform. For example:
Increasingly, customers’ IT footprints also include edge environments. To truly enable hybrid cloud, enterprise Kubernetes must come to the network’s edge, and customers must be able to manage edge sites as seamlessly as they manage apps deployed on private or public clouds. As Red Hat CEO Paul Cormier has said, if edge computing is going to be a realistic future for enterprise IT, it needs the hybrid cloud and open source to thrive. We are bringing OpenShift to the edge to enable this.
Customers want to deploy and manage modern apps across multiple clouds and clusters. This requires multi-cluster management to become a first-class concept. Red Hat is making that a reality with Red Hat Advanced Cluster Management for Kubernetes, which we introduced during Red Hat Summit 2020 and which will soon be generally available.
Modern apps come with new workloads, like artificial intelligence (AI), machine learning (ML), data services and more. Kubernetes is also serving as a linchpin in enabling these new workloads. One great example of these emerging workloads is our recently announced collaboration with Royal Bank of Canada (RBC), its AI research institute Borealis AI, and NVIDIA to transform customer banking and deliver intelligent apps on an AI infrastructure built on Red Hat OpenShift and NVIDIA’s DGX AI.
For these groundbreaking apps to make a difference, developers will need next-generation, cloud-native services like Istio, Knative, Tekton and more. It’s a blend of all of these characteristics that will help form a sustainable, usable and open future for Kubernetes. It’s also about delivering the best overall developer experience. Developers prefer the open source technologies that we have always been committed to. See the 2020 Stack Overflow Developer Survey: "Linux remains the most loved platform. Container technologies Docker and Kubernetes rank as the second and third most loved. They are also among the platforms that developers most want to learn, which demonstrates how beloved they are." With tools like the odo CLI for Kubernetes and OpenShift, Red Hat CodeReady Workspaces, Red Hat CodeReady Containers and more, we are focused on giving developers what they need to build in the cloud on these technologies.
It’s not just about greenfield container-native or cloud-native apps. Customers want to modernize traditional apps into cloud-native workloads. They want to eliminate the silos that typically exist between traditional and cloud-native application stacks by bringing standard VM-based workloads to Kubernetes. They want a platform where VMs, Linux containers and Windows containers coexist. All of this requires modern, consistent and simplified management that unites developer and operations teams on a single, fully open, production-ready platform that can cover the entirety of an organization’s application estate. Red Hat is making container-native virtualization a reality with OpenShift Virtualization, which was also introduced during Red Hat Summit 2020 and is based on the open source KubeVirt project.
With OpenShift Virtualization, users can manage all of the components of their enterprise application stack, whether VMs, containers or serverless functions, on the same platform. They will be able to modernize VMs by containerizing them (or not) on the same platform as their cloud-native apps. OpenShift Virtualization also includes full support for Windows VMs running older versions of Windows back to 2008, with the capability to refactor them over time to use Windows containers and Windows Server 2019, or to maintain them purely as VMs.
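What makes this coexistence possible is that KubeVirt represents a VM as just another Kubernetes resource, so the same declarative tooling that manages containers can manage VMs. As a rough illustration, the following Python sketch builds a manifest in the shape of KubeVirt's `VirtualMachine` API; the VM name, disk image and sizing are made up for the example:

```python
import json

# Illustrative KubeVirt-style VirtualMachine manifest. Field layout follows
# the kubevirt.io/v1 VirtualMachine API; the name, image and memory request
# below are hypothetical values for demonstration only.
vm_manifest = {
    "apiVersion": "kubevirt.io/v1",
    "kind": "VirtualMachine",
    "metadata": {"name": "legacy-windows-vm"},
    "spec": {
        "running": True,  # start the VM when the resource is created
        "template": {
            "spec": {
                "domain": {
                    "devices": {
                        "disks": [{"name": "rootdisk", "disk": {"bus": "virtio"}}]
                    },
                    "resources": {"requests": {"memory": "4Gi"}},
                },
                "volumes": [
                    {
                        "name": "rootdisk",
                        # Hypothetical registry path to a VM disk image
                        "containerDisk": {"image": "example.registry/windows-disk:latest"},
                    }
                ],
            }
        },
    },
}

manifest_json = json.dumps(vm_manifest, indent=2)
```

Because the VM is expressed as a declarative resource like any Deployment or Service, it can be versioned, templated and reconciled by the same pipelines that handle containerized workloads.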
Customers also want fully managed, autonomous services across their hybrid cloud environments, which are enabled by Kubernetes Operators. With Operators, developers can automate the management of a database, for example, with high availability, failover and other managed-service-like behaviors built in, so dev teams don’t have to deal with activities they’d rather avoid. We’ll soon provide an update on the momentum we’ve seen for Operators since Red Hat launched the Operator Framework as an open source project in 2018.
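At its core, an Operator is a control loop: it watches a custom resource declaring desired state and repeatedly drives actual state toward it. A minimal, cluster-free sketch of that reconcile step, assuming a hypothetical `DatabaseSpec` resource and an in-memory `cluster` dict standing in for the real Kubernetes API:

```python
from dataclasses import dataclass


@dataclass
class DatabaseSpec:
    """Desired state, as a user would declare it in a custom resource.

    This is an illustrative stand-in, not a real Operator SDK type."""
    name: str
    replicas: int
    version: str


def reconcile(spec: DatabaseSpec, cluster: dict) -> dict:
    """One pass of an Operator's control loop: compare desired vs. actual
    state and record the actions needed to converge them."""
    actions = []
    actual = cluster.setdefault(spec.name, {"replicas": 0, "version": spec.version})

    # Scale toward the declared replica count (a real Operator's HA and
    # failover logic would hook in here).
    if actual["replicas"] < spec.replicas:
        actions.append(f"create {spec.replicas - actual['replicas']} replica(s)")
        actual["replicas"] = spec.replicas
    elif actual["replicas"] > spec.replicas:
        actions.append(f"delete {actual['replicas'] - spec.replicas} replica(s)")
        actual["replicas"] = spec.replicas

    # Roll the version forward if the declared version changed.
    if actual["version"] != spec.version:
        actions.append(f"upgrade to {spec.version}")
        actual["version"] = spec.version

    return {"actions": actions, "status": dict(actual)}


result = reconcile(DatabaseSpec("orders-db", replicas=3, version="13"), cluster={})
```

Running the loop repeatedly is what gives Operators their "autonomous service" quality: whether a replica dies or a user edits the spec, the next reconcile pass converges the system back to the declared state.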
The future of Kubernetes, and indeed of enterprise IT, is to offer a level playing field, where legacy Java apps can communicate with Operator-run databases, serverless infrastructure code, C# apps in Windows containers, VMs running vendor apps and modern microservices, and where all of this can work together in any cluster, anywhere from on-premises data centers to cloud providers across the world.
Any app, anywhere, on any infrastructure, from the edge and bare metal to multiple clouds, in a common, consistent manner: that’s the future Kubernetes can provide, and it’s what Red Hat is focused on enabling for users worldwide.
About the author
Ashesh Badani is senior vice president of Cloud Platforms, responsible for leading Red Hat’s broad hybrid cloud portfolio, including product development and go-to-market strategy for Red Hat OpenShift, Red Hat OpenStack Platform, Red Hat Virtualization, Red Hat Cloud Suite, and Red Hat Cloud Infrastructure. In this role, Badani has helped to solidify Red Hat as a hybrid cloud and enterprise Kubernetes leader.