The IT world we know today is going through a phase of decentralization: computation is moving closer to where the data is generated. This means gathering and processing data closer to the application, an approach known as edge computing.
In this new world, devices and services are managed outside the traditional management sphere: platforms are pushed outside the data center, devices are spread across huge areas in inaccessible locations, and applications run on demand closer to the data. Companies now face three main challenges:
- How do you ensure you have the skills to address the challenges of your edge situation?
- How do you build capabilities that can react without human interaction in a secure and trusted way?
- How do you scale at the edge when you suddenly have huge numbers of devices and endpoints to consider?
Connect through a unified automation language
To address the unique challenges of your edge situation in an aggressively automated environment, you need a unified automation language that can speak natively to the world out there. The ability to integrate across ecosystems lets you tie the edge back into your data center. A language shared by domain experts and application developers allows them to cooperate on the same code and combine their expertise. Working together, they can build streamlined automation that spans storage, networking, infrastructure, cloud and IT security. This architecture paves the way to react to business needs more reliably.
With Red Hat Ansible Automation Platform built around such a language today, you can connect your assets at the edge with your core platforms in an automated fashion at large scale. A broad community of users, customers and partners is extending Ansible Automation Platform every day, writing automation content that you can use to build more controlled, more trusted and more secure solutions.
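As a small illustration of that shared language, here is a minimal playbook sketch. The inventory group and package name are hypothetical; the modules (ansible.builtin.dnf, ansible.builtin.service) are standard Ansible content that both a Linux administrator and an application developer can read and extend:

```yaml
# Hypothetical playbook: one language for core and edge hosts alike.
---
- name: Keep edge gateways patched and running
  hosts: edge_gateways            # assumed inventory group
  become: true
  tasks:
    - name: Ensure the telemetry agent is up to date
      ansible.builtin.dnf:
        name: telemetry-agent     # hypothetical package name
        state: latest

    - name: Ensure the agent service is running and enabled at boot
      ansible.builtin.service:
        name: telemetry-agent
        state: started
        enabled: true
```

The same declarative style applies whether the target is a server, a network device or a cloud API, which is what makes cooperation on shared code practical.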
Event-driven automation for the cloud with Ansible and Knative
But to excel at the edge, there also needs to be a designated solution to automatically transform events into actions. That’s where event-driven automation comes in: it enables applications to be decoupled and reactive, scaling only when an event happens. Two key technologies that are enabling and democratizing serverless architectures beyond the cloud providers are Knative and Cloud Events.
Knative enables almost any containerized application in Kubernetes to behave as serverless through two key Knative modules: Serving (auto-scaling behavior based on request load) and Eventing (an infrastructure for sending and receiving events in a standard way).
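A Knative Serving service looks much like any Kubernetes resource; the manifest below is a sketch with a hypothetical service name and placeholder image. The autoscaling annotations tell Knative to scale the container down to zero when idle and back up as requests arrive:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: event-handler                     # hypothetical service name
spec:
  template:
    metadata:
      annotations:
        # Scale to zero when idle; add replicas as request load grows.
        autoscaling.knative.dev/min-scale: "0"
        autoscaling.knative.dev/max-scale: "10"
    spec:
      containers:
        - image: registry.example.com/event-handler:latest   # placeholder image
```

Any containerized HTTP application can be deployed this way without code changes, which is what gives existing workloads serverless behavior.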
Cloud Events creates consistency between the different cloud providers and on-premise systems. The CNCF Serverless Working Group created this solution by collaborating with a number of vendors like Google, Microsoft, Red Hat and SAP, along with end user companies. Cloud Events is a specification for describing events in a common way, with the goal of enabling consistency, accessibility and portability for cloud native applications.
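The specification defines a small set of required context attributes (specversion, type, source, id) that every event carries regardless of producer. The sketch below uses hypothetical values and is shown in YAML for readability; the spec's canonical structured encoding is JSON:

```yaml
# Illustrative event; event type, source and payload are hypothetical.
specversion: "1.0"                   # required: spec version in use
type: com.example.ticket.created     # required: what happened
source: /ticketing/helpdesk          # required: where it happened
id: A234-1234-1234                   # required: unique per source
time: "2023-04-05T17:31:00Z"         # optional timestamp
datacontenttype: application/json
data:
  room: B-12
  target_temp_c: 21
```

Because every producer emits the same envelope, consumers can route and filter events without caring which cloud or system generated them.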
Combining a container platform with an automation platform
But how does this tie together with Ansible Automation Platform? Where does Ansible live in this vision? The answer is simple: wherever there is a need to integrate automation into anything outside of Red Hat OpenShift Container Platform, Ansible can excel. Knative is really good at listening for events, while Ansible is really good at automating everything, uniting two platform strengths to solve business problems. Now, Red Hat customers can more easily automate and orchestrate their IT infrastructure from compute to network, storage, cloud and beyond. Creating simple, user-friendly building blocks to combine these platforms opens up all kinds of possibilities for use cases.
How does it work?
Each time a matching event arrives in the Cloud Events format, Knative can spin up a container, which sends a programmatic API request to Ansible Automation Platform. The platform then automates the infrastructure, services or IT at the edge that sits outside of the native container environment.
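One way to wire this up is a Knative Eventing Trigger that filters on the event type and delivers matching events to a Knative service, which in turn calls the Ansible Automation Platform API (in AWX/automation controller, typically a POST to a job template's launch endpoint). The names below are hypothetical:

```yaml
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: ticket-to-ansible               # hypothetical trigger name
spec:
  broker: default
  filter:
    attributes:
      type: com.example.ticket.created  # only react to this event type
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: aap-job-launcher  # hypothetical service that calls the AAP API
```

The filter keeps unrelated events away from the subscriber, so the launcher service scales up only when work actually needs to be done.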
For example, imagine we get an event from our ticket system to change a climate system to a certain temperature. Ansible Automation Platform runs the required playbook and module and executes the necessary actions against the thermostats in question. It may also execute against other cloud APIs, third party services or systems that are not containerized, like a security workflow and networking endpoints. To learn more, check out Red Hat OpenShift Serverless and Knative.
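The playbook behind that example could be as simple as the following sketch. The building-management endpoint and the variables passed in from the event are hypothetical; ansible.builtin.uri is a standard module for calling REST APIs:

```yaml
# Hypothetical playbook launched by the ticket event.
---
- name: Apply requested temperature from the ticket event
  hosts: localhost
  gather_facts: false
  vars:
    room: "{{ event_room | default('B-12') }}"            # passed as extra vars
    target_temp_c: "{{ event_target_temp_c | default(21) }}"
  tasks:
    - name: Set the thermostat via the building-management REST API (hypothetical endpoint)
      ansible.builtin.uri:
        url: "https://bms.example.com/api/rooms/{{ room }}/thermostat"
        method: POST
        body_format: json
        body:
          target_temp_c: "{{ target_temp_c }}"
        status_code: 200
```

From here, additional tasks could notify the ticket system or kick off the security and networking workflows mentioned above.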
End-to-end management at the edge with the Red Hat portfolio
Edge computing is an integral part of Red Hat’s open hybrid cloud strategy. It extends the goal of providing a consistent experience for everyone from the app developer to the infrastructure ops team charged with deployment. Red Hat Enterprise Linux is already the keystone for Red Hat’s entire hybrid cloud portfolio, and edge is no different. With the intelligence of Red Hat Insights built into the platform, Red Hat Enterprise Linux is able to help IT operations teams identify, analyze and remediate potential issues at scale before they blossom into full-fledged problems.
Building on top of Linux is cloud-native infrastructure, most commonly Kubernetes, and for more than 2,000 enterprises globally, this is synonymous with Red Hat OpenShift. As Red Hat OpenShift runs these apps, Red Hat Advanced Cluster Management for Kubernetes helps make highly scaled-out edge architectures as secure, consistent and compliant as standard datacenter deployments, all from a single point of control. Learn more about why cluster management is critical to edge computing.
As edge computing continues to rise, companies will increasingly be tasked with managing systems outside their comfort zone. Implementing solutions that address these new challenges while freeing them to focus on the innovations on the horizon can set organizations apart in an automated future.