As the industry increasingly embraces the idea of cloud-native and containerized workloads, finer-grained architectures like microservices are becoming a model for building these applications. This approach enables a more incremental style of development that can accelerate innovation, increase flexibility, and support digital transformation; however, one of the more profound implications of this shift is that there are many more components in the application ecosystem that need to connect and exchange data with one another at different levels.
Integration has evolved to meet this need for agile, lightweight, high-performance connectivity. Red Hat Summit 2019 will feature dozens of sessions on integration, covering everything from APIs and events to messaging and streaming. The full list of sessions in the Integration & APIs track is available in the Summit session catalog.
Don't know where to start? Here are some sample sessions for consideration.
API Management: A Cornerstone in the Journey toward Open Government
As departments of the Government of Canada began their digital journeys, APIs became essential to transformation. Prior to the creation of an API store, many departments produced and consumed their own APIs independently, without a mature, common management framework or standards. This created roadblocks to fully realizing the value of these systems:
- Discoverability: With APIs scattered throughout government, some with little to no documentation, discovering valuable APIs is next to impossible.
- Security: Each system must implement its own API security layer, creating extra work for API publishers and consumers.
- API life-cycle management: Most systems provide little-to-no API life-cycle management capabilities, causing systems to break without warning or explanation.
- Performance: Most APIs are point-to-point and don’t offer any form of throttling, rate limiting, or caching.
- Process: Centralizing core systems, such as an API Management Platform, can cause process bottlenecks that impede adoption and cause publishers to find alternate solutions.
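The performance gap above is one thing an API management layer addresses by putting throttling and rate limiting in front of point-to-point APIs. As a minimal illustrative sketch only (not ISED's implementation, and not the algorithm of any particular Red Hat product), a token-bucket rate limiter in Python might look like:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: permits short bursts up to
    `capacity` requests while sustaining `rate` requests per second."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate              # tokens replenished per second
        self.capacity = capacity      # maximum burst size
        self.tokens = capacity        # start with a full bucket
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1          # spend one token for this request
            return True
        return False                  # over the limit: reject or queue

bucket = TokenBucket(rate=5, capacity=2)
results = [bucket.allow() for _ in range(4)]  # burst of 4 immediate calls
```

With a burst capacity of 2, the first two back-to-back calls are admitted and the rest are rejected until the bucket refills.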
In this session, we'll look at how Innovation, Science and Economic Development Canada (ISED) overcame these challenges at scale using Red Hat technologies to create a central, multitenant hub for departments to publish, better secure, and monitor their APIs, while providing a single destination for departments and the public to discover, learn about, and subscribe to APIs to power their own innovations.
Apache Camel K: Bringing Serverless Workloads to the Enterprise
Serverless technologies offer a new way to deal with infrastructure in the cloud, with a different approach that affects both technical development and costs. Knative, an open source project started in mid-2018, is emerging as a foundation for adding serverless capabilities on top of Kubernetes and OpenShift. But companies still struggle to find a path for adopting such technologies and integrating them with their existing applications.
In this session, we’ll present Apache Camel K, a platform that brings Red Hat Fuse integration capabilities into the serverless world. Camel K allows Camel domain-specific language (DSL) code to run directly on top of any Knative-powered cluster, using serverless capabilities such as auto-scaling, scaling to zero, and event-based communication—and, at the same time, connecting the serverless island with the outside world.
With the help of a demo, we’ll show how Camel K makes it easier to connect cloud services or enterprise applications as event sources and sinks for Knative eventing, using some of the 200+ Camel components available. We'll also show how it’s possible to intelligently route events inside the Knative environment through enterprise integration patterns.
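To give a feel for one of those patterns, a content-based router inspects each incoming event and forwards it to the sink that matches its type, with a dead-letter sink catching everything else. This is a minimal plain-Python sketch of the idea only, not the Camel DSL or the Knative API; all names are hypothetical:

```python
# Content-based router: forward each event to the sink matching its type.
def route(event: dict, sinks: dict) -> None:
    sink = sinks.get(event.get("type"), sinks["dead-letter"])
    sink.append(event)

orders, payments, dead_letter = [], [], []
sinks = {
    "order.created": orders,
    "payment.received": payments,
    "dead-letter": dead_letter,   # fallback for unrecognized event types
}

events = [
    {"type": "order.created", "id": 1},
    {"type": "payment.received", "id": 2},
    {"type": "inventory.checked", "id": 3},  # no matching sink
]
for event in events:
    route(event, sinks)
```

In Camel terms, the routing decision would live in a route definition and the sinks would be Knative services or channels; the pattern itself is the same.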
Data Streaming with Apache Kafka using AMQ Streams
Apache Kafka is a leading data streaming platform; its low-latency, high-throughput, highly available broker technology provides a foundation for event-based microservices and streaming analytics.
AMQ Streams is Red Hat’s enterprise-ready distribution of Apache Kafka, running on both RHEL and OpenShift Container Platform. AMQ Streams uses Operators to lower the barrier for organizations that want to adopt data streaming without having to become experts in operating and managing Apache Kafka clusters. It treats Kafka-related resources as first-class citizens, offering developers an OpenShift-native Kafka experience.
In this session, we'll show you how to use AMQ Streams to build your own Apache Kafka-based data streaming applications and deploy them on OpenShift Container Platform.
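The abstraction underneath all of this is Kafka's partitioned, append-only log: producers hash a record's key to a partition (preserving per-key ordering), and consumers read from an offset they track themselves. The in-memory sketch below illustrates that model only; it is not the Kafka or AMQ Streams API, and the class names are hypothetical:

```python
class Partition:
    """Append-only log of (key, value) records."""
    def __init__(self):
        self.records = []

    def append(self, key, value) -> int:
        self.records.append((key, value))
        return len(self.records) - 1      # offset of the new record

class Topic:
    """A topic is a set of partitions; keys are hashed to partitions."""
    def __init__(self, num_partitions: int = 3):
        self.partitions = [Partition() for _ in range(num_partitions)]

    def produce(self, key, value):
        # The same key always lands on the same partition,
        # which is what preserves per-key ordering in Kafka.
        p = hash(key) % len(self.partitions)
        return p, self.partitions[p].append(key, value)

    def consume(self, partition: int, offset: int):
        # Consumers pull from an offset they manage themselves.
        return self.partitions[partition].records[offset:]

topic = Topic()
p1, _ = topic.produce("user-42", "login")
p2, _ = topic.produce("user-42", "purchase")
```

Because both records share the key "user-42", they land on the same partition and a consumer replaying from offset 0 sees them in production order.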
Integration Patterns in a Serverless World: The Good Parts
Cloud-native applications of the future will consist of hybrid workloads: stateful applications, batch jobs, microservices, and functions, wrapped as Linux containers and deployed via Kubernetes in the cloud. Functions and the so-called serverless computing model are the latest evolution of what started as SOA years ago. Today, serverless can free developers from infrastructure, orchestration, and application platform responsibilities, as well as much of the software development life cycle. As with any new architectural style, it makes some old assumptions obsolete and brings new challenges to explore.
In this session, we'll explore the key challenges of function interaction and coordination, addressing them with classic integration patterns and modern approaches in Apache Camel to create next-generation distributed systems.
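One classic pattern for coordinating functions is the splitter/aggregator pair: a splitter breaks a composite message into parts that independent functions can process in isolation, and an aggregator correlates the replies back into a single result. The sketch below is plain Python for illustration, not Camel's API; the functions and prices are hypothetical:

```python
# Splitter: break a composite order into one message per line item.
def splitter(order: dict) -> list:
    return [{"order_id": order["id"], "item": item} for item in order["items"]]

# Stand-in for a serverless function invoked once per part.
def price_item(part: dict) -> dict:
    prices = {"widget": 3, "gadget": 7}
    return {"order_id": part["order_id"], "total": prices[part["item"]]}

# Aggregator: correlate replies by order_id and combine partial results.
def aggregator(replies) -> dict:
    totals = {}
    for reply in replies:
        totals[reply["order_id"]] = (
            totals.get(reply["order_id"], 0) + reply["total"]
        )
    return totals

order = {"id": "A1", "items": ["widget", "gadget", "widget"]}
totals = aggregator(price_item(part) for part in splitter(order))
```

The correlation key (`order_id` here) is what lets the aggregator reassemble replies even if the per-part functions complete out of order, which is the usual situation once each part is an independently scaled function.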
Re-imagining Business Automation: Convergence of Decisions, Workflow, AI/ML, RPA—Vision and Futures
Business process automation technology is emerging as a key to success in digital transformation projects. Automation is no longer just a set of business operations workflow items. A synergistic convergence of analytics, artificial intelligence and machine learning (AI/ML), robotic process automation (RPA), and low-code development is changing the marketplace. Businesses are recognizing the value of this unified view in automating operations and decisions to improve efficiency and agility. We'll explore:
- Exploiting data analytics to make well-informed decisions.
- Building intelligent, context-sensitive applications.
Interested in joining us for these sessions and more during Red Hat Summit 2019? Register now and use the code RHBLOG19 to receive $100 off your registration fees. (This code can only be used once per attendee, and cannot be combined with any other offer.)