What Is Edge Computing?

The edge computing model shifts computing resources from central data centers and clouds closer to devices. The goal is to support new applications with lower latency requirements while processing data more efficiently, reducing network costs. An example use case is the Internet of Things (IoT), in which billions of devices deployed each year produce vast amounts of data. Processing that data at the edge instead of in the cloud reduces backhaul costs.
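To make the backhaul savings concrete, here is a back-of-the-envelope sketch. All figures (fleet size, per-device volume, the fraction of data retained at the edge) are hypothetical, chosen only to illustrate the arithmetic.

```python
# Illustrative sketch: backhaul traffic saved by processing IoT data at
# the edge. Every number here is a made-up assumption, not a benchmark.

def backhaul_savings_gb(devices, mb_per_device_per_day, edge_retention):
    """Return (raw_gb, backhauled_gb) per day.

    edge_retention: fraction of raw data processed and kept at the edge
    instead of being sent to a central cloud.
    """
    raw_gb = devices * mb_per_device_per_day / 1024
    backhauled_gb = raw_gb * (1 - edge_retention)
    return raw_gb, backhauled_gb

# e.g. 50,000 sensors producing 20 MB/day each, 90% handled at the edge
raw, sent = backhaul_savings_gb(devices=50_000,
                                mb_per_device_per_day=20,
                                edge_retention=0.9)
print(f"raw: {raw:.0f} GB/day, backhauled: {sent:.0f} GB/day")
```

Under these assumptions, roughly a terabyte of raw data per day shrinks to under a tenth of that on the backhaul link.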

Edge computing revenue opportunities

Emerging ecosystem

Edge computing is an emerging ecosystem of resources, applications, and use cases, including 5G and IoT.

It will continue to enable new use cases and open up opportunities for telecom providers to develop new services that reach more people. Immediate revenue opportunities include any service that benefits from greater data speed and computational power near the user.


Location of the edge

Reduced latency and costs are key characteristics of edge computing. The edge is the location nearer the subscriber, where data is processed or stored without being backhauled to a central location. Its exact placement depends on the specific service or application and is chosen to optimize cost, performance, and user experience.


The difference between edge and fog computing

Fog computing is a term coined by Cisco in 2014 to describe the decentralization of computing infrastructure, or bringing the cloud to the ground.

Fog computing provides a repeatable structure for the edge computing concept, so enterprises can easily push compute power away from their centralized systems or clouds to improve scalability and performance.


Vendor's view of edge

The diagram depicts where the edge is located from various vendors' points of view. However, a clear distinction must be made between individual devices with compute power and edge computing platforms that serve many devices simultaneously.

Provider's view of edge

From a service provider's perspective, as shown in the diagram, edge computing is a continuum from the enterprise edge through the service provider's infrastructure to the public cloud. In business terms, edge computing is best located where the applications or services are optimized.

Benefits of edge computing

Edge computing is part of a larger ecosystem whose full benefits are still emerging. Some immediate benefits include:

Reduced latency and increased speed

Placing compute power near the edge shortens the distance data must travel, and milliseconds make a difference for many applications.
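A rough sketch of why distance matters: light in fiber travels at about two-thirds of its vacuum speed, so propagation delay alone grows with distance. Real latency also includes routing, queuing, and processing time; the distances below are illustrative placeholders, not measurements.

```python
# Illustrative sketch: one-way fiber propagation delay only.
# ~200,000 km/s in fiber = 200 km per millisecond.

FIBER_SPEED_KM_PER_MS = 200

def round_trip_ms(distance_km):
    """Best-case round-trip propagation delay over fiber."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

for label, km in [("edge site, 10 km", 10),
                  ("regional cloud, 500 km", 500),
                  ("distant cloud, 3000 km", 3000)]:
    print(f"{label}: ~{round_trip_ms(km):.1f} ms round trip")
```

Even this best-case floor shows an edge site tens of kilometers away can answer in well under a millisecond of propagation time, while a distant cloud cannot.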


Security

Data is analyzed locally and protected by the security blanket of an on-premises network or the closed system of a service provider.


Cost savings

Companies can optimize the flow of data into central systems and retain the bulk of raw data at the edge where it is useful. Bandwidth and costs are reduced.


Remote reliability

Edge devices locally store and process data and work with edge data centers to overcome any intermittent connectivity issues.
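One common pattern behind this reliability is store-and-forward buffering: readings queue locally while the uplink is down and flush when it returns. The sketch below is a minimal illustration with a simulated uplink flag, not a production design.

```python
# Hedged sketch of store-and-forward buffering at an edge device.
# The uplink_up flag stands in for real connectivity detection.

from collections import deque

class EdgeBuffer:
    def __init__(self):
        self.pending = deque()   # readings not yet delivered upstream
        self.delivered = []      # readings that reached the data center

    def record(self, reading, uplink_up):
        self.pending.append(reading)
        if uplink_up:
            self.flush()

    def flush(self):
        # Deliver the backlog in arrival order.
        while self.pending:
            self.delivered.append(self.pending.popleft())

buf = EdgeBuffer()
buf.record("t=1 ok", uplink_up=True)
buf.record("t=2 offline", uplink_up=False)  # buffered locally
buf.record("t=3 back", uplink_up=True)      # backlog flushes with it
print(buf.delivered)  # all three readings, in order
```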


Rapid scalability

Installing edge data centers and IoT devices can allow businesses to rapidly scale their operations.

Types of edge computing technology

Fog computing

Fog computing refers to decentralizing a computing infrastructure by extending the cloud through the placement of nodes strategically between the cloud and edge devices.

This puts data, compute, storage, and applications nearer to the user or IoT device where the data needs processing, thus creating a fog outside the centralized cloud and reducing the data transfer times necessary to process data.
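The fog pattern described above can be sketched as a node between the devices and the cloud that processes raw readings locally and forwards only a compact summary upstream. Function names, fields, and thresholds here are hypothetical.

```python
# Minimal sketch of a fog node reducing raw sensor data before it
# leaves for the cloud. All names and thresholds are illustrative.

def fog_summarize(readings, alert_threshold):
    """Reduce raw readings to a summary plus any anomalous values."""
    alerts = [r for r in readings if r > alert_threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
        "alerts": alerts,  # only anomalies travel upstream in full
    }

raw = [21.0, 21.4, 20.9, 35.2, 21.1]  # e.g. temperature samples
print(fog_summarize(raw, alert_threshold=30.0))
```

Only the summary and the outlier cross the wide-area link; the bulk of the raw stream never leaves the fog tier.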

Multi-access Edge Computing (MEC)

Per the European Telecommunications Standards Institute (ETSI) definition: MEC offers application developers and content providers cloud-computing capabilities and an IT service environment at the edge of the network. 

This environment is characterized by ultra-low latency and high bandwidth as well as real-time access to radio network information that can be leveraged by applications.

Micro data centers

Micro data centers are highly mobile and rugged. They provide the same components as traditional data centers but can be deployed locally near the data source. 

Highly flexible micro data centers can be custom built and configured to suit the implementation requirements of unique situations. This flexibility allows data centers to be rapidly deployed to underserved areas or disaster sites, for example.

Cloudlets

Modeled after clouds, cloudlets are mobility-enhanced, small-scale data centers placed in close proximity to edge devices so those devices can offload processing to the cloudlet. They are designed in particular to improve resource-intensive and interactive mobile apps by providing low-latency computing resources nearby.
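The offload decision a mobile app might make can be sketched as a simple comparison: run the task locally, or pay the network round trip to run it on a faster cloudlet. All timing and capacity figures below are made-up assumptions for illustration.

```python
# Hypothetical offload-decision sketch: local execution vs. a nearby
# cloudlet. Cycle counts, clock rates, and delays are illustrative.

def best_placement(task_cycles, local_hz, cloudlet_hz, rtt_ms, payload_ms):
    """Pick whichever placement is predicted to finish first."""
    local_ms = task_cycles / local_hz * 1000
    cloudlet_ms = rtt_ms + payload_ms + task_cycles / cloudlet_hz * 1000
    if cloudlet_ms < local_ms:
        return ("cloudlet", cloudlet_ms)
    return ("local", local_ms)

# e.g. a 2-billion-cycle task on a 1 GHz phone vs. a cloudlet with
# 10x the effective compute, 5 ms away, with 20 ms of data transfer
print(best_placement(2e9, 1e9, 1e10, rtt_ms=5, payload_ms=20))
```

With these numbers the cloudlet wins easily; as the round trip grows (a distant cloud instead of a nearby cloudlet), the balance tips back toward local execution, which is the intuition behind placing cloudlets close to the user.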

Emergency response units

These mobile, self-contained units establish interoperable communications for first responders in emergency situations. They can be deployed rapidly to any crisis site, along with a highly skilled Tactical Operations team, to re-establish communications for the affected areas.