Traditionally, enterprise organizations have used data-centric integration approaches to connect multiple systems, services and applications. This approach is well suited to maintaining data consistency and integrity across those systems, and it supports moving large amounts of data, connecting to disparate systems and creating master data fabrics that accommodate multiple data management scenarios.

As enterprise organizations build for digital experiences, they face new challenges in deploying and scaling distributed systems that can ingest large amounts of data at a fast pace. Developer teams are expected to respond rapidly to change by adapting business applications, delivering data faster and scaling up and down based on user demand. However, traditional integration approaches cannot support all of the new scenarios and requirements that come with delivering digital experiences.

Event-driven architecture is an integration approach that has gained popularity as enterprise organizations look to solve the newer data challenges that come with delivering improved digital experiences. In this approach, events travel unidirectionally between applications, from source to destination, with the goal of creating and delivering consistent streams of events. The many technologies and tools that support the event-driven approach improve resiliency by reducing dependencies between applications, and they reduce the load on the overall integration system by filtering out all but the important data or events.

Apache Kafka is one of the best solutions on the market for event-driven integration. It is a proven open source technology designed to capture, ingest and stream large amounts of events with low overhead, delivering data in near real time.

What is event-driven architecture?

Event-driven architecture is a software architecture paradigm in which applications detect, capture and consume events. The event-driven approach has caught the attention of enterprise organizations because of its ability to improve the scalability, flexibility and durability of enterprise application integrations.

An event is any significant occurrence or change in state in system hardware or software. An event is not the same as an event notification: the event is the change in state itself, while the notification is the message sent to signal that the change occurred. Events can originate from internal or external inputs generated by applications, sensors and systems, among other sources.
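To make the idea concrete, an event can be modeled as an immutable record of something that has already happened. The sketch below is purely illustrative: the OrderPlaced name and its fields are hypothetical, and in practice events are often serialized as JSON or Avro messages.

```java
// A hypothetical event type: an immutable record of a fact that has
// already occurred, rather than a command or a request.
public record OrderPlaced(String orderId, String customerId, long timestampMillis) {}
```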

Event-driven solutions collect, publish and consume events for processing. Processing those events can produce a reaction or trigger a notification to an application or user.

One of the benefits of event-driven architecture is that events are published to a broker, and only the consumers that care about a given event ingest and process it. The application that creates the event has no idea who will collect and process it. This interaction is called decoupling, which allows applications to remain independent and focus on their particular tasks.
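As a rough sketch of this decoupling, consider a producer built with the Apache Kafka Java client. It needs only a broker address and a topic name; it never references the applications that will consume the event. The broker address, topic name and payload below are placeholder assumptions, not values from a real deployment.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class OrderEventProducer {
    public static void main(String[] args) {
        // Placeholder connection settings; point these at your own cluster.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // The producer knows only the topic name ("orders"), never the
            // consumers. Any number of applications can subscribe later.
            producer.send(new ProducerRecord<>("orders", "order-1001", "{\"status\":\"PLACED\"}"));
        }
    }
}
```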

The event-driven architecture approach is a great fit for newer digital applications that need to capture events from a variety of data sources at a faster pace than before. An integration-centric approach cannot provide the same level of decoupling and resiliency because it funnels data through a central point of failure. Another common approach, data-centric integration, is typically used for data management solutions where the focus is on storing data rather than moving it.

The value of Apache Kafka

Apache Kafka is a distributed data streaming platform that uses the publish/subscribe method to move streams of events between microservices, cloud-native or traditional applications, and other systems.

Kafka is designed to publish, subscribe to, store and process event streams from multiple sources, handling events in parallel to allow near real-time communication. Captured events are streamed through brokers, decoupling client and service communications. This makes Apache Kafka a popular choice for event-driven integrations.
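A matching consumer sketch, again using the Kafka Java client with placeholder connection details, shows the other half of the publish/subscribe interaction: the consumer subscribes to a topic and reacts to events as they arrive. Running several copies of this consumer with the same group.id lets Kafka spread the topic's partitions across them, which is how events get processed in parallel.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class OrderEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("group.id", "order-processors");        // consumers sharing a group split the partitions
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));
            while (true) {
                // Poll the broker for new events and react to each one.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("Reacting to event %s -> %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```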

The Red Hat approach

Red Hat has been committed to deploying and supporting Apache Kafka in enterprise environments since 2018, providing support for running it on Kubernetes and creating an ecosystem of services around it. Red Hat’s enterprise-ready Kubernetes container platform, Red Hat OpenShift, provides a consistent application platform to build, deploy and run applications across the hybrid cloud. Deploying Apache Kafka on Kubernetes allows event-driven applications to be automated, scaled and deployed anywhere.

Red Hat provides Apache Kafka through a few different deployment options serving multiple use cases. Red Hat AMQ Streams is a scalable, distributed and high-performance data streaming platform based on the Apache Kafka project. It is designed to simplify the deployment, configuration, management and use of Apache Kafka on OpenShift.

For enterprise organizations that prefer to focus on their core competencies and want to worry less about infrastructure, Red Hat offers an opinionated service that makes it fast and easy to get started with Kafka-based event-driven architecture. Red Hat OpenShift Streams for Apache Kafka is a managed cloud service designed to help developers incorporate streaming data into applications to deliver real-time experiences.

We invite you to check out our webinar series “Understanding Kafka in the Enterprise” or visit the Red Hat OpenShift Streams for Apache Kafka page to learn more about the different use cases.


About the author

Jennifer Vargas is a marketer, with previous experience in consulting and sales, who enjoys solving business and technical challenges that seem disconnected at first. For the last five years, she has worked at Red Hat as a product marketing manager supporting the launch of a new set of cloud services. Her areas of expertise are AI/ML, IoT, integration and mobile solutions.
