Serverless computing continues to be a much-discussed technology, in large part because it is seen as enabling organizations to focus on their applications rather than on the underlying infrastructure. More specifically, serverless “refers to the concept of building and running applications that do not require server management. It describes a finer-grained deployment model where applications and microservices, bundled as functions or not, are uploaded to a platform and then executed, scaled, and billed in response to the exact demand needed at the moment,” as defined by the Cloud Native Computing Foundation (CNCF) community.

Today marks a milestone for serverless, and specifically for Knative. While Knative's serverless primitives have been available since July 2018, giving users a more portable way to write serverless applications, today brings a commitment to making the technology enterprise-ready. Technology leaders, including Red Hat, Google, SAP and IBM, are committing to Knative as an open standard in their commercial serverless offerings. Knative's common set of constructs enables interoperability for serverless workloads across Kubernetes installations. Coupled with Red Hat OpenShift, Knative can further enable portability of operations in hybrid environments.

Red Hat OpenShift - our industry-leading enterprise Kubernetes platform - plans to add support for Knative as a dev preview early next year. This is designed to let users build and run serverless applications, integrating with Red Hat OpenShift Service Mesh, which is based on the Istio and Kiali projects. We are also using Strimzi, which makes it easier to run Apache Kafka on OpenShift or Kubernetes and is delivered as Red Hat AMQ Streams, for reliable eventing, and Camel-K, a lightweight integration framework built from Apache Camel that enables multiple event sources to be used as triggers for serverless applications. Today, users can get started with Knative on minishift by reproducing our demo.
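To give a flavor of the developer experience, the sketch below shows what a minimal Knative Service manifest might look like. It assumes the pre-GA serving.knative.dev/v1alpha1 API that Knative ships today, and the service name and container image are hypothetical placeholders rather than anything from the demo.

```yaml
# A minimal Knative Service sketch, assuming the pre-GA
# serving.knative.dev/v1alpha1 API; name and image are hypothetical.
apiVersion: serving.knative.dev/v1alpha1
kind: Service
metadata:
  name: greeter
spec:
  runLatest:
    configuration:
      revisionTemplate:
        spec:
          container:
            image: quay.io/example/greeter:latest  # hypothetical image
```

Applied with oc apply (or kubectl), Knative Serving takes care of creating the underlying revision, route and deployment, and scales the service down to zero when it is idle.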

Looking ahead, Red Hat plans to take this a step further, enabling consistent operations for Knative-based serverless applications across any cloud provider and across the ecosystem of ISVs offering Certified OpenShift Services.

Earlier this year, Google Cloud introduced Knative, an open source project that builds on the common platform capabilities of Kubernetes, including monitoring, logging, identity management, and security. Knative provides a set of middleware components for building modern, source-centric, container-based applications that can run on-premises or in the cloud. The model enables developers to write and manage serverless applications in a consistent way and is friendly to both developers and operators. Red Hat is among the major vendors bringing their expertise in enterprise Kubernetes to those building blocks, helping to ensure they are well suited to developer needs. Knative is also compatible with other emerging technologies, such as Istio.

Red Hat is committed to supporting hybrid application development and operations, and that now also includes serverless workloads. With the Operator Lifecycle Manager and the Operator Framework, the way you deploy, manage and upgrade serverless platforms, event sources and applications can be consistent across any cloud provider, as well as on-premises. This not only provides consistency but also enables interoperability.
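As a sketch of what that could look like in practice, the OLM Subscription below illustrates how a cluster administrator might install a Knative operator from a catalog. The package name, channel, catalog source and namespace shown here are assumptions for illustration and will vary by distribution and release.

```yaml
# Hypothetical OLM Subscription for a Knative Serving operator.
# Package name, channel, catalog source and namespace are illustrative.
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: knative-serving-operator            # hypothetical package name
  namespace: openshift-operators            # assumed operator namespace
spec:
  name: knative-serving-operator            # hypothetical package name
  channel: alpha                            # assumed channel
  source: community-operators               # assumed catalog source
  sourceNamespace: openshift-marketplace
```

Because the same Subscription can be applied to any cluster running OLM, the install and upgrade experience stays consistent whether the cluster runs on a public cloud or on-premises.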

At this week’s KubeCon, we are discussing how Red Hat is working to bring Knative CRDs to developers, enabling ISVs and their services to be consumed by applications and functions written in an opinionated model. Simply put, this model can enable event-driven applications to inherit serverless traits, including scale-from-zero and on-demand event handling, following industry best practices for modern, cloud-native application development. It keeps code and configuration separate, and allows for disposable resources, provisioning and a pluggable build model. For OpenShift, it means that developers can reuse tooling they are already familiar with, such as OpenShift builds, source-to-image (S2I) and Buildah.
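To make the event-driven model more tangible, the sketch below wires a hypothetical event-producing container to the Knative Service from the earlier sketch using a ContainerSource. The API group and fields reflect the early sources.eventing.knative.dev/v1alpha1 API and may change as Knative Eventing evolves; the source name and image are placeholders.

```yaml
# Sketch of a Knative Eventing source, assuming the early
# sources.eventing.knative.dev/v1alpha1 API; name and image are hypothetical.
apiVersion: sources.eventing.knative.dev/v1alpha1
kind: ContainerSource
metadata:
  name: heartbeat-source                      # hypothetical name
spec:
  image: quay.io/example/heartbeats:latest    # hypothetical event producer
  sink:
    apiVersion: serving.knative.dev/v1alpha1
    kind: Service
    name: greeter                             # the Knative Service sketched above
```

When events stop arriving, the sink service scales back to zero; when they resume, Knative scales it up on demand.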

Along with Google Cloud and other key enterprise players, Red Hat intends to take the interoperability of serverless workloads to the next level. With OpenShift and Knative together, other containerized applications can also benefit from the serverless, hybrid cloud.

“Introduced in July, Knative is an open source project based on Kubernetes that provides critical building blocks for modern serverless applications,” said Oren Teich, Director of Product Management, Google Cloud. “In the four months since then we have seen tremendous enthusiasm from contributing companies, many of whom are furthering Knative as the serverless standard for hybrid and multi-cloud users. Red Hat has been a contributor to Knative from the start, and the integration of Knative in OpenShift is a key outcome of our joint open source efforts.”

For more info, see Google’s blog. For more on our work with Knative and serverless, please see the first post in our blog series on Red Hat OpenShift, Knative: Serving your Serverless Services.


About the author

William Markito Oliveira is an energetic and passionate product leader with expertise in software engineering and distributed systems. He leads a group of product managers working on innovative and emerging technologies.
