In this post:

  • Get an overview of edge computing and how it differs from cloud computing.
  • Explore four benefits of edge computing, along with common scenarios, examples and use cases.

From telecommunications networks and the manufacturing floor to financial services and autonomous vehicles, computers are everywhere these days, generating a growing tsunami of data that needs to be captured, stored, processed and analyzed.

At Red Hat, we see edge computing as an opportunity to extend the open hybrid cloud all the way to data sources and end users. While data has traditionally lived in the datacenter or cloud, there are benefits and innovations to be realized by processing the data these devices generate closer to where it is produced.

This is where edge computing comes in.

What is edge computing?

Edge computing is a distributed computing model in which data is captured, stored, processed and analyzed at or near the physical location where it is created. By pushing computing out closer to these locations, users benefit from faster, more reliable services while companies benefit from the flexibility and scalability of hybrid cloud computing.

Edge computing vs. cloud computing

A cloud is an IT environment that abstracts, pools and shares IT resources across a network. An edge is a computing location at the edge of a network, along with the hardware and software at those physical locations. Cloud computing is the act of running workloads within clouds, while edge computing is the act of running workloads on edge devices.

You can read more about cloud versus edge here.

4 benefits of edge computing

As the number of computing devices has grown, our networks simply haven’t kept pace with the demand, causing applications to be slower and/or more expensive to host centrally.

Pushing computing out to the edge helps reduce many of the issues and costs related to network latency and bandwidth, while also enabling new types of applications that were previously impractical or impossible.

1. Improve performance

When applications and data are hosted on centralized datacenters and accessed via the internet, speed and performance can suffer from slow network connections. Moving things out to the edge reduces network-related performance and availability issues, although it doesn't eliminate them entirely.

2. Place applications where they make the most sense

By processing data closer to where it's generated, insights can be gained more quickly and response times reduced drastically. This is particularly true for locations with intermittent connectivity, such as geographically remote offices and vehicles like ships, trains and airplanes.
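
One common way to handle intermittent connectivity is to buffer data locally and forward it once the link returns. The sketch below is a minimal store-and-forward illustration, assuming a hypothetical send_to_datacenter() uplink function and a local SQLite file as the buffer; it is not tied to any particular product.

    # Minimal store-and-forward sketch for an intermittently connected edge site.
    # send_to_datacenter() is a hypothetical uplink; SQLite is the local buffer.
    import json
    import sqlite3

    db = sqlite3.connect("edge_buffer.db")
    db.execute("CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, payload TEXT)")

    def buffer_reading(reading):
        """Persist a reading locally so nothing is lost while offline."""
        db.execute("INSERT INTO outbox (payload) VALUES (?)", (json.dumps(reading),))
        db.commit()

    def flush_outbox(send_to_datacenter):
        """Forward buffered readings in order once connectivity returns."""
        rows = db.execute("SELECT id, payload FROM outbox ORDER BY id").fetchall()
        for row_id, payload in rows:
            if not send_to_datacenter(json.loads(payload)):  # uplink still down
                break
            db.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
            db.commit()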

3. Simplify meeting regulatory and compliance requirements

Different situations and locations often have different privacy, data residency, and localization requirements, which can be extremely complicated to manage through centralized data processing and storage, such as in datacenters or the cloud.

With edge computing, however, data can be collected, stored, processed, managed and even scrubbed in place, making it much easier to meet different locales’ regulatory and compliance requirements. For example, edge computing can be used to strip personally identifiable information (PII) or faces from video before it is sent back to the datacenter.
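
As a simplified illustration of scrubbing data in place, the sketch below drops or hashes a few personally identifiable fields from an event record before it leaves the edge site. The field names are hypothetical, and a real deployment would follow the specific residency rules of its locale.

    # Minimal PII-scrubbing sketch; the field names are hypothetical placeholders.
    import hashlib

    PII_FIELDS = {"name", "email", "phone"}      # fields dropped outright
    PSEUDONYMIZE_FIELDS = {"customer_id"}        # fields hashed instead

    def scrub(record):
        """Return a copy of the record that is safe to send off-site."""
        clean = {}
        for key, value in record.items():
            if key in PII_FIELDS:
                continue                         # never leaves the edge
            if key in PSEUDONYMIZE_FIELDS:
                clean[key] = hashlib.sha256(str(value).encode()).hexdigest()
            else:
                clean[key] = value
        return clean

    event = {"customer_id": 4821, "email": "a@example.com", "basket_total": 42.50}
    forwarded = scrub(event)   # only the hashed ID and the basket total remain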

4. Enable AI/ML applications

Artificial intelligence and machine learning (AI/ML) are growing in importance and popularity since computers are often able to respond to rapidly changing situations much more quickly and accurately than humans.

But AI/ML applications often require processing, analyzing and responding to enormous quantities of data, which can’t reasonably be achieved with centralized processing due to network latency and bandwidth issues. Edge computing allows AI/ML applications to be deployed close to where data is collected, so analytical results can be obtained in near real time.
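
A minimal sketch of that pattern, assuming a model trained centrally and exported to the edge device, might look like the following. The model file, read_sensor_frame() and publish_result() helpers are hypothetical; onnxruntime is shown only as one possible local inference runtime. The key point is that only the small result, not the raw data, is sent upstream.

    # Edge inference sketch: the model file, read_sensor_frame() and
    # publish_result() are hypothetical stand-ins.
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("defect_detector.onnx")   # model trained centrally, deployed to the edge
    input_name = session.get_inputs()[0].name

    def read_sensor_frame() -> np.ndarray:
        """Stand-in for a camera or sensor driver on the edge device."""
        return np.random.rand(1, 3, 224, 224).astype(np.float32)

    def publish_result(score: float):
        """Stand-in for sending only the small result upstream."""
        print("score:", score)

    frame = read_sensor_frame()
    outputs = session.run(None, {input_name: frame})   # inference stays local
    publish_result(float(np.max(outputs[0])))          # only the verdict leaves the site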

3 edge computing scenarios

Red Hat focuses on three general edge computing scenarios, although these often overlap in each unique edge implementation.

1. Enterprise edge

Enterprise edge scenarios feature an enterprise data store at the core, in a datacenter or as a cloud service. The enterprise edge allows organizations to extend their application services to remote locations.

Chain retailers are increasingly using an enterprise edge strategy to offer new services, improve in-store experiences and keep operations running smoothly. Individual stores aren’t equipped with large amounts of computing power, so it makes sense to centralize data storage while extending a uniform app environment out to each store.

2. Operations edge

Operations edge scenarios concern industrial edge devices, with significant involvement from operational technology (OT) teams. The operations edge is a place to gather, process and act on data on site.

Operations edge computing is helping some manufacturers harness AI/ML to solve operational and business efficiency issues through real-time analysis of data provided by Industrial Internet of Things (IIoT) sensors on the factory floor.
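
As a rough sketch of what that on-site analysis can look like, the example below keeps a rolling window of sensor readings and flags values that drift several standard deviations from the recent mean. The raise_alert() hook is hypothetical, and a production system would more likely use a trained model than a fixed statistical threshold.

    # Rolling-window anomaly check for an IIoT sensor stream at the edge.
    # raise_alert() is a hypothetical hook for acting on the result locally.
    from collections import deque
    from statistics import mean, stdev

    window = deque(maxlen=200)              # recent readings kept in memory on site

    def raise_alert(value, mu, sigma):
        print(f"anomaly: {value:.2f} (mean {mu:.2f}, stdev {sigma:.2f})")

    def check_reading(value, threshold=3.0):
        """Flag readings more than `threshold` standard deviations from the recent mean."""
        if len(window) >= 30:               # wait for enough history before judging
            mu, sigma = mean(window), stdev(window)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                raise_alert(value, mu, sigma)   # act locally, in real time
        window.append(value)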

3. Provider edge

Provider edge scenarios involve both building out networks and offering services delivered with them, as in the case of a telecommunications company. The service provider edge supports reliability, low latency and high performance with computing environments close to customers and devices.

Service providers such as Verizon are updating their networks to be more efficient and reduce latency as 5G networks spread around the world. Many of these changes are invisible to mobile users, but allow providers to add more capacity quickly while reducing costs.

3 edge computing examples

Red Hat has worked with a number of organizations to develop edge computing solutions across a variety of industries, including healthcare, space and city management.

1. Healthcare

Clinical decision-making is being transformed through intelligent healthcare analytics enabled by edge computing. By processing real-time data from medical sensors and wearable devices, AI/ML systems are aiding in the early detection of a variety of conditions, such as sepsis and skin cancers.

Read more about edge computing in healthcare.

2. Space

NASA has begun adopting edge computing to process data close to where it’s generated in space rather than sending it back to Earth, which can take minutes to days to arrive.

As an example, mission specialists on the International Space Station (ISS) are studying microbial DNA. Transmitting that data to Earth for analysis would take weeks, so they’re experimenting with doing those analyses onboard the ISS, speeding “time to insight” from months to minutes.

Read more about edge computing in space.

3. Smart cities

City governments are beginning to experiment with edge computing as well, incorporating emerging technologies such as the Internet of Things (IoT) along with AI/ML to quickly identify and remediate problems impacting public safety, citizen satisfaction and environmental sustainability.

Read more about edge computing and smart cities.

Red Hat’s approach to edge computing

Of course, the many benefits of edge computing come with some additional complexity in terms of scale, interoperability and manageability.

Edge deployments often extend to a large number of locations that have minimal (or no) IT staff, or that vary in physical and environmental conditions. Edge stacks also often mix and match a combination of hardware and software elements from different vendors, and highly distributed edge architectures can become difficult to manage as infrastructure scales out to hundreds or even thousands of locations.

The Red Hat Edge portfolio addresses these challenges by helping organizations standardize on a modern hybrid cloud infrastructure, providing an interoperable, scalable and modern edge computing platform that combines the flexibility and extensibility of open source with the power of a rapidly growing partner ecosystem.

The Red Hat Edge portfolio allows organizations to build and manage applications across hybrid, multi-cloud and edge locations, increasing app innovation, speeding up deployments and updates, and improving overall DevSecOps efficiency.

Learn more