We’ve previously outlined the role of service providers in edge technology innovation and how building a robust ecosystem of partners multiplies functional and business opportunities while mitigating risk and investment.
In order to support a broad variety of use cases spanning multiple industries, edge computing requires collaboration across suppliers, service providers and application and content partners. Additionally, with widely distributed networks and physical presence, service providers are uniquely positioned to deploy edge computing infrastructures that are close to the user and tightly integrated with transport and access networks.
The explosion in the number and variety of endpoints, mobile applications, and distributed computing drives this need—all while demanding functionality and quality of service expectations are met.
How might the rapidly changing edge technology landscape benefit from the adaptability provided by open source solutions?
Where does the complexity originate?
The edge computing model continues to gain prominence as demand for real-time processing and low-latency connectivity increases in pursuit of acceptable quality of experience (QoE).
Edge computing is becoming progressively more important, and in time, indispensable as part of a hybrid computing model—with the centralized cloud persisting in its role as the means to store and process information.
Edge data volumes and performance demands are not only increasing but accelerating, driven directly by the emerging, value-added applications themselves.
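To make the latency motivation concrete, here is a toy back-of-the-envelope calculation (the distances and fiber propagation speed are illustrative assumptions, not figures from this article) showing why physical proximity alone places a floor under achievable round-trip time:

```python
# Toy sketch: best-case round-trip propagation delay over fiber.
# All distances below are hypothetical; light in optical fiber travels
# at roughly two-thirds the speed of light in a vacuum.

SPEED_IN_FIBER_M_S = 2.0e8

def round_trip_propagation_ms(distance_m: float) -> float:
    """Lower bound on round-trip propagation delay over fiber, in ms."""
    return 2 * distance_m / SPEED_IN_FIBER_M_S * 1000

central_cloud_km = 2000  # hypothetical distant, centralized cloud region
edge_site_km = 20        # hypothetical metro edge site

for label, km in [("central cloud", central_cloud_km),
                  ("edge site", edge_site_km)]:
    print(f"{label} at {km} km: >= {round_trip_propagation_ms(km * 1000):.2f} ms RTT")
```

Even before queuing, processing, or radio-access delays are added, the distant region cannot beat roughly 20 ms of round trip, while the nearby edge site starts two orders of magnitude lower, which is why latency-sensitive applications gravitate to the edge.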
Service providers are aware of the new reality in networking and computing architecture evolutions. In fact, they have often paced the industry in effecting these changes and promoting wider adoption.
Following the proven success of Network Functions Virtualization (NFV), it was inevitable that they would capitalize on architectural shifts that offer greater efficiency than the currently dominant architectures. The elements driving edge computing complexity from the service provider perspective are outlined below.
Applications and devices
Of particular interest are applications subject to both latency and data processing considerations, which would most likely benefit from edge computing implementations, including:
Virtual reality (VR) and augmented reality (AR) – Overcoming bandwidth and latency challenges to retain immersion in the application is crucial to QoE.
Internet of Things (IoT) devices – With many potential data exchanges between receiving and resolving a request, increased computing power on the device itself, or closer to it in the network, can improve the user experience.
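The IoT point above can be sketched in miniature. In this hypothetical example (the function, field names, and threshold are all invented for illustration), an edge node reduces a window of raw sensor samples to one compact summary and resolves alerts locally, rather than shipping every reading to a distant cloud:

```python
# Hypothetical sketch of edge-side aggregation: process raw IoT samples
# close to the data source and forward only a small summary upstream.

from statistics import mean

def summarize_at_edge(samples: list[float], alert_threshold: float) -> dict:
    """Reduce one window of raw readings to a single summary record."""
    return {
        "count": len(samples),
        "mean": round(mean(samples), 2),
        "max": max(samples),
        # The alert decision is made at the edge, avoiding a cloud round trip.
        "alert": max(samples) > alert_threshold,
    }

# One window of raw temperature readings from a hypothetical sensor
raw_window = [21.0, 21.2, 20.9, 21.1, 35.5, 21.0]

summary = summarize_at_edge(raw_window, alert_threshold=30.0)
print(summary)  # one record crosses the backhaul instead of six raw samples
```

The pattern trades a little compute at the edge for fewer upstream exchanges and faster local decisions, which is exactly the trade the article describes.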
Hardware configuration and architectures
Hardware configurations and architectures that support new frameworks expanding 5G and IoT capabilities will be important focus areas.
IoT applications and devices – Real-time processing requirements are creating demand for cloud-based infrastructure at the edge. Devices are also growing more powerful in their own right, driven by the need to keep processing as close to data sources as possible to gain speed and reduce latency, especially for applications using location-based information.
5G support - From the architectural stance, 5G can create a wide range of impacts on service deliveries, including new functionalities at the network edge:
The introduction of virtualized RANs (vRANs), which virtualize control for multiple radio infrastructures. Radio control functions will be supported in pools of cloud-based, software-driven resources at the 5G network edge.
Deploying networks using the architectural framework known as control and user plane separation (CUPS), which takes advantage of the agility and elasticity of virtualization.
Service providers fulfill multiple roles
Service providers are embracing a dual technical and business model: suppliers of proven network infrastructure to the enterprise and providers of end-user services—some delivered directly to consumers and some with ecosystem partners.
In addition to adopting cloud frameworks for their network and application delivery infrastructures—anchoring them to NFV reference architectures in many important use cases—service providers have been aggressive in exploring new classes of service at the edge that take advantage of their proximity to customers as demand for in-place services increases.
Can a single vendor provide a complete end-to-end edge computing solution?
Although technical solutions are able to keep pace with the challenges at present, there is no way to predict the progression of future innovations, or what solutions will arise to address them.
Since edge devices come in all forms and serve all sorts of functions, there is no standard model or platform. The rise and variety of open source solutions at the edge is happening by necessity: no two edge networks and device clusters are alike, requiring high degrees of customization.
As a result, open source should play a critical role in supporting and powering edge devices. Additionally, partner ecosystem development is important, especially to amortize service provider infrastructure investments and introduce innovation.
These factors contribute to the need for flexible solutions and standardized operational environments, so that many operators can seamlessly provide their own value-added services.
While edge computing solutions targeting enterprises and service providers have been announced for use cases including AI/ML and IoT, these have been based on proprietary configurations, limiting customer choice and future adaptability.
Because no single vendor can claim a complete end-to-end edge computing solution, the continued rise of open source is well suited to encourage open, interoperable solutions now and in the future.
This will help service providers become enablers of advanced edge computing applications, whether the resulting service is provided by a single operator or composed across multiple operators.
Industry support of open source and edge computing
Data collected from Red Hat’s The State of Enterprise Open Source report provides compelling support for open source implementation in general, and edge computing deployment in particular:
90% of IT leaders surveyed are using enterprise open source today, and they’re using it for:
IT infrastructure modernization (64%).
Application development (54%).
Digital transformation (53%).
69% of respondents prefer to use multiple vendors for their cloud infrastructure needs, suggesting a preference for infrastructure that spans providers rather than being limited to a single one.
Respondents cited the top benefits of using enterprise open source as:
Higher quality software (35%).
Access to latest innovations (33%).
Better security (30%).
Ability to safely leverage open source technologies (30%).
In the two most prevalent emerging technology areas, edge computing/IoT and artificial intelligence/machine learning (AI/ML), use of enterprise open source is expected to significantly outpace proprietary software over the next two years.
In edge computing/IoT, enterprise open source use is expected to increase from 55% to 72% two years from now.
For AI/ML, proprietary software use is expected to decrease, while enterprise open source use rises from 48% to 65%.
81% of IT leaders responded that open source “provides flexibility to customize solutions to meet company needs,” and 79% expect that over the next two years, their organization will increase use of enterprise open source software for emerging technologies.
We observed shifting priorities and needs, from lower cost of ownership to innovation. In past surveys, lower cost of ownership was cited as the top benefit of enterprise open source; it has now fallen well below “access to the latest innovations,” which ranks second.
A wide range of solutions for flexible deployment configurations
Open source implementation for edge computing solutions aligns directly with Red Hat’s ultimate vision: consistent operator and end-user experience across any workload, any footprint and any location.
Combined with our participation in upstream open source communities like Kubernetes, OpenStack, and Fedora, Red Hat’s evolving portfolio of products is well positioned to serve customers and partner opportunities in the edge computing market and can help enable new edge use cases.
Red Hat Enterprise Linux (RHEL) provides the foundation for edge computing solutions—from the datacenter to the device edge. RHEL enables multiple connectivity options for edge devices and hardware accelerators like GPUs needed for edge analytics.
Red Hat OpenShift Container Platform can be implemented on bare metal for edge use cases and optimized for container-centric, high-performance, small-footprint clusters or single-node instances. The core functions required to deliver and manage edge computing infrastructure at large scale are currently being contributed to the upstream open source projects.
Red Hat OpenStack Platform serves edge use cases requiring virtual or bare-metal Infrastructure-as-a-Service via industry-standard APIs. It features support for distributed compute nodes, creating a highly scalable architecture for the most challenging virtual machine (VM) workloads (e.g., NFV and high performance computing) as well as containerized workloads and Platform-as-a-Service delivered by running Red Hat OpenShift on OpenStack.
Hyperconverged infrastructure deployment architecture combines key infrastructure components from Red Hat’s portfolio to enable edge use cases requiring small-footprint, highly available standalone clusters—clusters that integrate control, storage and compute in a single operational footprint.
Red Hat Hyperconverged Infrastructure for Virtualization (RHHI-V) is one of the commercial hyperconverged infrastructure (HCI) solutions for customers needing such a standalone cluster. Red Hat continues to invest to stay ahead of customers' evolving needs for unified orchestration of VM and container workloads, and simplified operational experience with future hyperconverged solutions built on top of Red Hat OpenShift and Red Hat Ceph Storage.
Red Hat Ceph Storage provides open, massively scalable storage solutions for modern workloads like cloud infrastructure, data analytics, media repositories, and backup and restore systems.
Open source and the hybrid cloud are the building blocks of modern telecommunications components and architectures and are especially well suited for in-place and emerging standards.
Delivering an end-to-end edge computing solution is not a single-vendor job. The capability to accommodate interoperable, flexible solutions now and in the future is deeply important. By embracing an open source foundation supporting advanced edge computing applications, especially in collaboration with others, service providers can foster new, more capable ecosystems that drive greater efficiency and accelerate the pace of innovation.
Today's use cases might have been unimaginable a decade ago (let alone those that might be needed in the decade ahead). Service providers need to build and retain the ability to operate adaptable systems and platforms in both enterprise and end-user markets.
This is especially true as the telecommunications industry speeds toward technologies such as 5G, with service providers looking at competitive advantages through new technologies that will enable them to deliver more capabilities to their customers more rapidly. We invite you to explore how Red Hat is helping customers and partners construct open and agile edge computing platforms.