Traditionally, enterprise organizations operate using data-centric integration approaches to connect multiple systems, services and applications. This approach is ideal for maintaining and improving data consistency and integrity across all systems and applications. It also allows moving large amounts of data, connecting to disparate systems, and creating master data fabrics that accommodate multiple data management scenarios.
As enterprise organizations build for digital experiences, they face new challenges for deploying and scaling distributed systems that can ingest large amounts of data at a fast pace. The expectation is for developer teams to rapidly respond to change by adapting business applications, delivering data faster and scaling up and down based on user demands. However, the traditional integration approaches cannot support all the new scenarios and requirements that have emerged from deploying digital experiences.
Event-driven architecture is an integration approach that has gained popularity as enterprise organizations look to solve newer data challenges while delivering improved digital experiences. It is based on moving events between applications, traveling unidirectionally from source to destination, with the goal of creating and delivering consistent streams of events. Many technologies and tools support the event-driven approach; they improve resiliency by reducing dependencies between applications and reduce the load on the entire integration system by filtering for the important data or events.
Apache Kafka is one of the leading solutions in the market for event-driven integration. It is a proven open source technology designed for capturing, ingesting and streaming large amounts of events with low overhead, providing the capability to deliver data in near real time.
What is event-driven architecture?
Event-driven architecture is a software architectural paradigm that captures, detects and consumes events. The event-driven approach has caught the attention of enterprise organizations due to the ability to improve integration scalability, flexibility and durability for enterprise applications.
An event is any significant occurrence or change in state for system hardware or software. An event is not the same as an event notification. The source of an event can be from internal or external inputs generated by applications, sensors and systems, among others.
Event-driven solutions allow for collecting, publishing and consuming events for processing. Processing those events could produce a reaction or trigger a notification to an application or user.
One of the benefits of event-driven architecture is that events are published by an application, and only the consumers that care about a given event ingest and process it. The application that creates the event has no idea who will collect and process it. This separation is called decoupling, and it allows applications to remain independent and focus on their particular tasks.
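The decoupling described above can be sketched with a minimal in-memory broker. The `EventBroker` class and topic names below are illustrative assumptions, not any real library's API: producers publish to a topic without knowing who subscribes, and consumers receive only the events they registered interest in.

```python
from collections import defaultdict
from typing import Callable

class EventBroker:
    """Illustrative in-memory event broker, not a production messaging system."""

    def __init__(self) -> None:
        # topic -> list of handlers subscribed to that topic
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # The producer never references a consumer directly: it only
        # names a topic, and the broker fans the event out.
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
received: list[dict] = []
broker.subscribe("orders.created", received.append)

# The consumer ingests the event it cares about; an event on a topic
# with no subscribers is simply dropped, and the producer is unaffected.
broker.publish("orders.created", {"order_id": 42, "amount": 19.99})
broker.publish("inventory.updated", {"sku": "A-1"})
```

Because the producer and consumer share only the topic name, either side can be replaced, scaled, or taken offline without changing the other.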
The event-driven architecture approach is a great fit for newer digital applications that must capture events from a variety of data sources at a faster pace than before. An integration-centric approach cannot provide the same level of decoupling and resiliency, because it centralizes the data in a single point of failure. Another common approach is data-centric integration, typically used for data management solutions where the attention is centered on storing data rather than moving it.
The value of Apache Kafka
Apache Kafka is a distributed data streaming platform that uses the publish/subscribe method to move streams of events between microservices, cloud-native or traditional applications and other systems.
The product has been designed to handle publishing, subscribing to, storing and processing event streams from multiple sources, and to process events in parallel, allowing for near real-time communication. The captured events are streamed through brokers, decoupling client-service communication. This makes Apache Kafka a popular choice for event-driven integrations.
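The core idea behind a Kafka topic can be illustrated with a simplified model: an append-only log stored by the broker, from which each consumer group reads at its own pace by tracking an offset. The `TopicLog` class below is an in-memory sketch of that idea, not the Kafka client API; it ignores partitions, replication, and retention.

```python
class TopicLog:
    """Toy model of a single-partition Kafka-style topic: an append-only
    log plus a committed read offset per consumer group."""

    def __init__(self) -> None:
        self._log: list[bytes] = []
        self._offsets: dict[str, int] = {}  # consumer group -> next offset to read

    def produce(self, record: bytes) -> int:
        # Producers only append; existing records are never modified.
        self._log.append(record)
        return len(self._log) - 1  # offset assigned to this record

    def consume(self, group: str, max_records: int = 10) -> list[bytes]:
        # Each group resumes from its own committed offset, so groups
        # read the same stream independently and in parallel.
        start = self._offsets.get(group, 0)
        batch = self._log[start:start + max_records]
        self._offsets[group] = start + len(batch)
        return batch

topic = TopicLog()
topic.produce(b"event-1")
topic.produce(b"event-2")

# Two consumer groups each receive the full stream, decoupled from the
# producer and from one another.
analytics = topic.consume("analytics")
billing = topic.consume("billing")
```

Because the log is durable and offsets belong to the consumer, a slow or restarted consumer can catch up later without the producer noticing, which is what makes the broker-mediated decoupling described above possible.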
The Red Hat approach
Red Hat has been committed to deploying and supporting Apache Kafka in enterprise environments since 2018, providing support for running it on Kubernetes and creating an ecosystem of services around it. Red Hat’s enterprise-ready Kubernetes container platform, Red Hat OpenShift, provides a consistent application platform to build, deploy and run applications across the hybrid cloud. Deploying Apache Kafka on Kubernetes allows event-driven applications to be automated, scaled and deployed anywhere.
Red Hat provides Apache Kafka with a few different deployment options serving multiple use cases. Red Hat AMQ Streams is a scalable, distributed and high-performance data streaming platform based on the Apache Kafka project. The solution has been designed to simplify the deployment, configuration, management and use of Apache Kafka on OpenShift.
For those enterprise organizations that prefer to focus on their core competencies and want to worry less about infrastructure, Red Hat offers an opinionated service that makes it fast and easy to get started with Kafka-based event-driven architecture. Red Hat OpenShift Streams for Apache Kafka is a managed cloud service designed to help developers incorporate streaming data into applications to deliver real-time experiences.
About the author
Jennifer Vargas is a marketer, with previous experience in consulting and sales, who enjoys solving business and technical challenges that seem disconnected at first. For the last five years, she has been working at Red Hat as a product marketing manager, supporting the launch of a new set of cloud services. Her areas of expertise are AI/ML, IoT, integration and mobile solutions.