According to IDC, the edge market is expected to reach $251 billion by 2025, as industries including transportation, healthcare, energy, materials, and retail rank the Internet of Things (IoT), edge computing, or both as a priority emerging tech workload.
Those industries are already using edge architecture for tasks like monitoring transportation assets for predictive maintenance to reduce downtime, remote medical monitoring to help meet increased demand for continuous care as life expectancies rise, delivery presales analytics for more precise customer targeting, and route optimization for better tracking and utilization of shipping containers.
Edge technology is gaining traction in the enterprise for accelerating agility and innovation by reducing latency and supporting real-time decision-making. In plain English, edge computing places compute, networking, and storage resources closer to where they'll be used.
If you work in technology, this definition is most likely familiar to you. But what does edge computing mean in practice? To answer this question, I turned to the Red Hat Learning Community, a collaborative, global network of technologists working on achieving certifications. Learning-community members can get one-on-one support and receive answers directly from their peers.
[ Working at the edge? Learn more about validated patterns. ]
I reached out to the community in search of ways to define edge computing and some examples of how organizations are using it today. Their responses have been edited for grammar, clarity, and conciseness. My comments are in italics.
What is edge computing?
Edge is not a technology as much as it's a way to use technology. Edge computing is a distributed architecture that, by bringing data closer to its source or end user, opens the door for many different technologies to thrive. This includes IoT, 5G, autonomous vehicles, gaming, and virtual reality.
Josip Stanesic: The easiest way to describe edge computing is that it brings servers and processing power closer to customers so they have a better experience.
Saif Bassim: Edge computing refers to a range of networks and devices at or near the user. Edge is about processing data closer to where it's being generated, enabling processing at greater speeds and volumes, which makes it possible to act on results in real time.
Jashandeep Singh: In simple terms, I would say edge computing is about processing data closer to where it's being generated, hence enabling processing at greater speeds and volumes.
[ Check out Red Hat's Portfolio Architecture Center for a wide variety of reference architectures you can use. ]
Edge computing differs from traditional computing
Traditional computing is built around a centralized datacenter. The everyday internet isn't well suited to moving huge chunks of data due to bandwidth limitations, latency issues, and unpredictable network disruptions. Organizations are adopting edge computing to overcome these issues.
Saif Bassim: Edge computing offers some unique advantages over traditional models where computing power is centralized at an on-premises datacenter. Putting computing at the edge allows companies to improve how they manage and use physical assets and create new interactive, human experiences. Some examples of edge use cases include self-driving cars, autonomous robots, smart equipment data, and automated retail. However, before shifting something to the edge, think about how you'll handle storage costs, potential data loss, and security as your data is distributed across multiple sites.
There are different types of "edge"
There's more to the edge than how data is processed—there are different types of edges.
Trevor Chandler: When I see "edge" all by its lonesome with no other context, I automatically think about the network edge. Maybe that's because I'm a lot more connected to the networking side of IT. But the network edge and edge computing are different.
In my words, the network edge refers to the endpoints or devices that provide services to those at the edge.
Rob Goelz: Edge, to me, means the edge of the network. This may be just inside the network, in the DMZ, or some other portion of the network: a section of the network infrastructure before the internal or private network. This allows computing operations to happen closer to the ingress of network data, requiring fewer "hops" to reach the computing required to act upon the data.
In plain English, "hops" refers to network hops: the number of intermediate devices data passes through between the client and the node serving it, which grows the farther away that node sits. When you start talking about edge infrastructure, you may also hear about local area networks (LANs) or wide area networks (WANs). Whether your devices are on a LAN or a WAN depends on where the data is coming from: different workstations in a building versus sharing data between a corporate office and a regional office, for instance.
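To make the hops-and-latency idea concrete, here's a toy Python sketch. The hop counts and per-hop delay below are made-up illustrative numbers, not measurements; the point is only that round-trip latency scales with how many hops sit between the client and the node doing the work.

```python
# Toy model: round-trip latency grows with the number of network hops.
# Hop counts and per-hop delay are illustrative assumptions, not real data.

def round_trip_ms(hops: int, per_hop_ms: float = 2.5) -> float:
    """Estimate round-trip latency: out and back across each hop."""
    return 2 * hops * per_hop_ms

central_dc_hops = 18  # client -> distant central datacenter (assumed)
edge_node_hops = 4    # client -> nearby edge node (assumed)

central = round_trip_ms(central_dc_hops)
edge = round_trip_ms(edge_node_hops)

print(f"central datacenter: {central:.0f} ms round trip")
print(f"edge node:          {edge:.0f} ms round trip")
print(f"latency saved:      {central - edge:.0f} ms per request")
```

Real networks are messier than a flat per-hop delay, but the basic trade is the same: fewer hops to the compute means a faster response.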
What comprises the edge is changing quickly
Edge brings data to life by bringing it closer to where the action happens to reduce latency between when an event happens and when a response occurs. The need for edge is rising due to the growth of modern applications such as IoT, autonomous vehicle systems, telecom, and mobile edge computing, which all require minimal latency.
Theophanis Kontogiannis: At the pace CPUs change in philosophy, design, and production, soon even our mobile phones will be doing hardcore edge processing.
Darren Lavery: We have an invisible world of technology, and it's a bit of a mess with AI, analytics, silos, different datacenters, and networking meshes. I think edge is the beginning of bringing it all together: creating a core of data that edge technologies can plug into, centralizing things and creating a unique oneness in the near future.
Kubernetes is a good way to take advantage of the edge
Kubernetes is an excellent strategy for handling edge computing. It was built for distributed operation, scaling from multi-region datacenters down to multiple edge locations.
Tharun Polimera: KubeEdge is an open source framework and provides a containerized edge computing platform, which is inherently scalable. As it is modular and optimized, it is lightweight and can be deployed on low-resource devices.
Here are a few of the most common ways to take advantage of a Kubernetes edge architecture:
- Managing clusters at edge locations: While cost could be something to consider, this takes a bit of the legwork out of deployment.
- Putting the "node" bees to work: If you are strapped for resources and have a single Kubernetes cluster in your datacenter, you can use processing nodes to help facilitate deployment at your various edge locations.
- Edge devices: Low-resource edge devices support the processing nodes by moving data through the Kubernetes edge nodes for processing.
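As a rough illustration of the label-matching idea behind scheduling workloads onto edge nodes, here's a plain-Python simulation of how a Kubernetes-style nodeSelector narrows a cluster down to eligible nodes. This is not the Kubernetes API; the node names and labels are invented for the example.

```python
# Simulate Kubernetes-style nodeSelector matching: a workload declares the
# labels it needs, and only nodes carrying all of those labels are eligible.
# Node names and labels are invented for illustration.

nodes = {
    "core-worker-1": {"region": "us-east", "node-role": "worker"},
    "edge-store-042": {"region": "us-east", "node-role": "edge",
                       "location": "retail-store"},
    "edge-cell-107": {"region": "us-west", "node-role": "edge",
                      "location": "cell-tower"},
}

def eligible_nodes(node_selector: dict[str, str]) -> list[str]:
    """Return node names whose labels satisfy every key/value in the selector."""
    return [
        name
        for name, labels in nodes.items()
        if all(labels.get(k) == v for k, v in node_selector.items())
    ]

# A workload pinned to edge nodes running in retail stores:
print(eligible_nodes({"node-role": "edge", "location": "retail-store"}))
# → ['edge-store-042']
```

In a real cluster, you'd attach labels with `kubectl label node <name> key=value` and set a matching `nodeSelector` in the pod spec; the scheduler then performs essentially this filtering for you.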
Kubernetes architecture not only makes edge deployments easier for specific use cases (such as 5G telecom and IoT operations), but it can also accelerate the automation of mission-critical workloads and reduce the overall cost of 5G deployment.
If you're into systems design, you're probably already pretty knowledgeable about the edge. But how well versed are your colleagues in edge computing? They're some of the people who could help secure edge buy-in among your stakeholders. Join the Red Hat Learning Community to share your thoughts on the topic or to get answers to other technical questions.