Organizations are seeing the value in Apache Kafka and looking to use it to process event data as part of intelligent cloud-native applications. Many Red Hat customers already use Operators to simplify the deployment, configuration and management of Kafka on Red Hat OpenShift through the Red Hat AMQ streams component. Today, we are taking a step further to reduce the work required to stand up Kafka and integrate it into your environment with the introduction of Red Hat OpenShift Streams for Apache Kafka, a standalone cluster fully managed by Red Hat that provides Kafka without the maintenance burden.


The tl;dr on Apache Kafka

Maybe you haven't heard of Kafka, or perhaps you have but in the industry way where people talk up a technology without actually explaining what it does or how it's used. So here's a short take on Kafka, and you'll find a longer one in this article on RedHat.com.

Kafka is an open source platform, originally developed by LinkedIn and now homed at the Apache Software Foundation, that helps move and consume large amounts of data. It connects with a wide variety of data sources like PostgreSQL or AWS S3 and can be used to process event streams from many different programming languages. This goes far beyond batch processing and takes us into the next generation of messaging. 

Kafka tends to be used for two major types of activities: data analytics and processing streaming events. On the analytics side, Kafka shows up in payment processing, in analyzing sensor data from IoT devices, in social media applications, in healthcare data and the list goes on. When you have non-stop streaming data starting at point A and you need to move it to point B quickly to make better use of it, that's when you need Kafka.
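To make that concrete, here is a minimal sketch of publishing events to Kafka with the standard Java client. The broker address, topic name and payload below are placeholders for illustration, not anything specific to OpenShift Streams.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PaymentEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder address: point this at your own broker or managed Kafka instance.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // Each record is one event: the key is a payment id, the value is the event payload.
            ProducerRecord<String, String> record = new ProducerRecord<>(
                    "payments", "payment-42", "{\"amount\": 19.99, \"currency\": \"USD\"}");
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Event written to %s-%d at offset %d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        } // close() flushes any buffered records before exiting
    }
}
```

Events published this way are appended to the topic and retained, so any number of downstream consumers can read them at their own pace.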

Historically, applications were written to rely on batch data, which introduces delays and frustrates customers and internal stakeholders. Using Kafka to deliver streaming data can produce a better and more immediate result.

Likewise, Kafka can help organizations process streaming events. Organizations have moved away from monolithic applications to loosely coupled microservices. Kafka, along with APIs, can help teams consume microservices and allow for more agile development practices.

A distributed, event-driven architecture requires a "backbone" that captures, communicates and helps process events. Kafka is right at home serving as the communication backbone that connects your data sources and events to applications. 

Kafka is equally at home streaming data to applications that need access to it as it happens, and providing analytics on data as it moves through the stream.
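As an illustration of the consuming side, here is a minimal Java sketch that subscribes to the same hypothetical payments topic and reacts to each event as it arrives; again, the broker address and group id are placeholders.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PaymentEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder address
        props.put("group.id", "payment-analytics");         // consumers in one group share the partitions
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("payments"));
            while (true) {
                // poll() returns whatever events have arrived since the last call.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // React to each event as it happens instead of waiting for a nightly batch job.
                    System.out.printf("key=%s value=%s offset=%d%n",
                            record.key(), record.value(), record.offset());
                }
            }
        }
    }
}
```

Because producers and consumers agree only on the topic, either side can be scaled or replaced independently, which is part of what makes Kafka a good fit as the backbone between microservices.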

Sounds pretty valuable, right? Absolutely, but the grunt work of installing, managing and maintaining Kafka is not where the value lies. The value comes in having Apache Kafka up and running, integrated with your Kubernetes environment and ready to use. 

Apache Kafka as a part of your development and deployment toolbox

With Red Hat OpenShift Streams for Apache Kafka, we handle the infrastructure, uptime and upgrades so that organizations can focus on building and scaling their applications. You get 24x7 coverage, a 99.95% uptime SLA, metrics, monitoring and much more. This saves your teams a great deal of time, which they can instead spend building applications and adding value to the business.

Developer experience was a top priority when we designed and built OpenShift Streams. It provides a streamlined experience for building, deploying and scaling real-time applications in hybrid-cloud environments.

Benefits of OpenShift Streams for Apache Kafka

Currently in development preview, OpenShift Streams for Apache Kafka provides a consistent Kafka experience across distributed microservices, handles large data transfer volumes and comes with managed operations. Administrators can focus on other tasks while Red Hat handles the infrastructure, and developers can self-provision Kafka resources, giving them independence and efficiency.

OpenShift Streams for Apache Kafka is fully managed by Red Hat Site Reliability Engineers. As with OpenShift managed services and products, daily operations, like logging and upgrades, are proactively addressed behind the scenes. 

While OpenShift Streams for Apache Kafka doesn't require an organization to use OpenShift, it fits in beautifully with building apps on OpenShift. Red Hat provides service binding Operators so developers can connect OpenShift workloads to Kafka topics with ease. 
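As a sketch of what that connection looks like from application code, the snippet below configures a standard Kafka producer from environment variables, the sort of values a binding or Secret might inject into a pod. The variable names and the TLS-plus-SASL/PLAIN settings are assumptions for illustration, not the exact mechanism OpenShift Streams or the service binding Operators use.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class BoundKafkaClient {
    public static void main(String[] args) {
        // Hypothetical variable names -- use whatever your binding or Secret actually injects.
        String bootstrap = System.getenv("KAFKA_BOOTSTRAP_SERVERS");
        String clientId = System.getenv("KAFKA_CLIENT_ID");
        String clientSecret = System.getenv("KAFKA_CLIENT_SECRET");

        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrap);
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // Standard Kafka client settings for an endpoint secured with TLS and SASL/PLAIN.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"" + clientId + "\" password=\"" + clientSecret + "\";");

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("payments", "hello from an OpenShift workload"));
        }
    }
}
```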

Developers get Kafka at their fingertips to create, discover and connect to real-time data streams wherever they're deployed. This makes it much easier for them to connect loosely coupled microservices running in OpenShift, to deliver real-time experiences to users and to build data analytics applications. They can make use of the web-based UI on cloud.redhat.com, a REST API or a CLI.
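Alongside the UI, REST API and CLI, the standard Apache Kafka AdminClient covers the same create-and-discover workflow from code. In this sketch the broker address, topic name and sizing are illustrative only.

```java
import java.util.Collections;
import java.util.Properties;
import java.util.Set;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class TopicSetup {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder address

        try (AdminClient admin = AdminClient.create(props)) {
            // Create a topic for a new stream: 3 partitions, replication factor 3 (illustrative sizing).
            NewTopic orders = new NewTopic("orders", 3, (short) 3);
            admin.createTopics(Collections.singleton(orders)).all().get();

            // Discover which streams already exist.
            Set<String> topicNames = admin.listTopics().names().get();
            topicNames.forEach(System.out::println);
        }
    }
}
```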

Naturally we plan to tie OpenShift Streams for Apache Kafka into a schema registry, so teams can easily discover and connect to streaming data topics from other teams — and publish their own so they can be consumed by other developers in your organization. 

OpenShift Streams for Apache Kafka can also be combined with other managed cloud services like Red Hat OpenShift API Management (now GA) and Red Hat OpenShift Data Science (also in development preview).

Get to know OpenShift Streams for Apache Kafka 

At this point you're probably wondering, "How do I get my hands on this?" If you're already an OpenShift customer, you can try it today via the Try Kafka link. You can also try the service by registering on cloud.redhat.com.

OpenShift Streams for Apache Kafka is currently in development preview. Keep an eye out for announcements about additional availability. 


About the author

Red Hat is the world’s leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies.
