Edge computing is computing that takes place at or near the physical location of either the user or the source of the data, reducing latency and saving bandwidth.
In a cloud computing model, compute resources and services are often centralized at large datacenters, which are accessed by end users at the edge of a network. This model offers proven cost advantages and efficient resource sharing. However, new forms of end-user experiences like IoT need compute power closer to where a physical device or data source actually exists, i.e., at the network's "edge."
By placing computing services closer to these locations, users benefit from faster, more reliable services with better user experiences, while companies benefit by being better able to process data, support latency-sensitive applications, and use technologies like AI/ML analysis to identify trends and offer better products and services.
Edge devices are physical hardware located in remote locations at the edge of the network, with enough memory, processing power, and computing resources to collect data, process it, and act on it in near real time with limited help from other parts of the network.
Edge devices require some kind of network connectivity to facilitate back-and-forth communication between the device and a database at a centralized location. An edge device is where the data is collected and processed.
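As a rough sketch of that collect-process-act loop, the snippet below simulates an edge device that reduces raw readings to a compact summary and makes a local decision, queuing only the summary for later sync with a central database. The sensor, thresholds, and field names are illustrative assumptions, not from any real device.

```python
import random
import statistics

def read_sensor():
    """Simulate a temperature reading; a real device would poll hardware."""
    return 20.0 + random.random() * 10.0

def process_batch(readings, alert_threshold=28.0):
    """Reduce raw readings to a compact summary plus a local decision."""
    summary = {
        "mean": statistics.mean(readings),
        "max": max(readings),
    }
    # Act locally, in near real time, without waiting on the network.
    summary["alert"] = summary["max"] > alert_threshold
    return summary

upload_queue = []  # synced to the central database when connectivity allows
readings = [read_sensor() for _ in range(100)]
upload_queue.append(process_batch(readings))

print(len(upload_queue))  # one compact summary queued, not 100 raw readings
```

The point of the sketch is the shape of the data flow: decisions happen on the device, and only the reduced result travels over the intermittent link.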
Edge computing can complement a hybrid computing model, specifically where centralized computing is used for:
- compute intensive workloads
- data aggregation and storage
- artificial intelligence/machine learning
- coordinating operations across geographies
- traditional back end processing
Edge computing, in turn, handles the latency-sensitive side of that hybrid model, with use cases such as:
- autonomous vehicles
- augmented reality/virtual reality
- smart cities
Edge computing can also help solve problems at the data source, in near real time. In short, if reduced latency and/or real-time monitoring can support business goals, there is a use case for edge computing.
The Internet of Things (IoT)
For an IoT device, there can be many network hops between receiving and resolving a request. The more compute power available on the device itself, or at least close to it in the network, the better the user experience.
When problems arise in mobile computing, they often revolve around latency and service failures. Edge computing can help meet stringent latency constraints by reducing signal propagation delays. It can also limit service failures to a smaller area or user population, or provide a degree of service continuity despite intermittent network connectivity.
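To make the propagation-delay point concrete, here is a back-of-the-envelope calculation: light in fiber travels at roughly 200,000 km/s, so distance alone sets a floor on round-trip time. The specific distances below are illustrative assumptions, not figures from the text.

```python
# Signal propagation in fiber is roughly 200,000 km/s,
# i.e. about 200 km per millisecond, each way.
FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km):
    """Minimum round-trip propagation delay over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_MS

print(round_trip_ms(2000))  # distant datacenter: 20.0 ms floor
print(round_trip_ms(20))    # nearby edge site: 0.2 ms floor
```

Queuing, routing, and processing add more delay on top, but no optimization can beat this physical floor, which is why moving the service closer is the only way to get below it.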
As service providers modernize their networks, they are moving workloads and services out of the core network (in datacenters) towards the network’s edge: around points of presence and central offices. With the virtualization of central offices, one of the last physical interfaces for service delivery, service providers can reach the goal of deploying services at the network edge.
IoT produces a large amount of data that needs to be processed and analyzed so it can be used. Edge computing moves computing services closer to the end user or the source of the data, such as an IoT device.
Edge computing is a local source of processing and storage for the data and computing needs of IoT devices, which reduces the latency of communication between IoT devices and the central IT networks those devices are connected to.
Edge computing allows you to benefit from the large amount of data created by connected IoT devices. Deploying analytics algorithms and machine learning models to the edge enables data processing to happen locally and be used for rapid decision making.
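A minimal sketch of that edge-side analytics pattern: run a stand-in "model" locally and forward only notable events upstream instead of raw data. The z-score baseline and the sample values are assumptions for illustration, not a prescribed algorithm.

```python
import statistics

def fit_baseline(history):
    """Learn a simple baseline (mean, stdev) from past readings."""
    return statistics.mean(history), statistics.stdev(history)

def detect_anomalies(readings, baseline, z_threshold=3.0):
    """Flag readings far from the baseline; only these leave the device."""
    mean, stdev = baseline
    return [r for r in readings if abs(r - mean) / stdev > z_threshold]

baseline = fit_baseline([10.0, 10.2, 9.8, 10.1, 9.9])
outbound = detect_anomalies([10.0, 10.1, 15.0], baseline)
print(outbound)  # only the anomalous reading is sent upstream
```

In production the threshold check would be replaced by a trained model deployed to the device, but the data-flow benefit is the same: decisions are made locally and rapidly, and upstream bandwidth carries only the interesting results.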
IIoT stands for Industrial Internet of Things, a term for connected devices in manufacturing, energy, and other industrial practices. IIoT devices are often deployed in connection with edge computing. IIoT is significant for bringing more automation and self-monitoring to industrial machines, helping improve efficiency.
Multi-access edge computing (MEC) is a type of network architecture that provides cloud computing capabilities and an IT service environment at the edge of the network. The goal of MEC is to reduce latency, ensure highly efficient network operation and service delivery, and improve the customer experience.
Multi-access edge computing is now more broadly defined as an evolution in cloud computing that uses mobility, cloud technologies, and edge computing to move application hosts away from a centralized datacenter to the edge of the network, which results in applications that are closer to end users and computing services that are closer to the data created by applications.
5G refers to the fifth generation of mobile networks, representing upgrades in bandwidth and latency that enable services that weren't possible on older networks. 5G networks promise gigabit-class speeds, with data transmission rates of up to 10 Gbps, vastly reduced latency, and expanded coverage in remote areas.
5G can be considered a use case for edge computing, and it also enables other edge use cases. Edge computing is a way to meet the performance and low latency requirements of 5G networks and improve the customer experience.
Adopting edge computing is a high priority for many telco service providers as they modernize their networks and seek new sources of revenue by moving workloads and services out of the core network and toward points of presence and central offices.
For telcos, the apps and services their customers want to consume on edge networks are the key to revenue generation, but success depends on building the right ecosystem and coordinating among stakeholders and technology partners alike.
No single vendor can provide a complete edge computing solution. Instead, you will assemble a solution from multiple components. Open source platforms ensure interoperability across a wide ecosystem, without the vendor lock-in of a proprietary technology stack. And to enable new edge computing use cases, Red Hat is investing in upstream open source communities like Kubernetes, OpenStack, and Fedora IoT.