
Distributed Architectures: How Fog and Edge Computing Can Fit Into Your Cloud

September 19, 2019

As the number of IoT devices continues to increase – a predicted 75 billion by 2025 – so do the data requirements. Cisco estimates that IoT will generate more than 500 zettabytes of data per year by the end of 2019. To create an environment where IoT devices and applications are seamlessly connected to one another and to their end users, sufficient computational and storage resources are needed to perform advanced analytics and machine learning, which the cloud can provide.

However, cloud servers are often located too far from the IoT endpoints to transmit time-sensitive data quickly and reliably to and from billions of “things” across large distances. And in scenarios like surgeons operating with robotic surgical devices or driverless cars depending on ‘smart’ traffic lights – both on track to become realities very soon with 5G – traditional cloud computing alone will not suffice, since higher latencies are simply not an option.

That’s where edge and fog computing have come into play.

What is Edge Computing?

To understand fog computing, we first have to understand edge computing. Edge computing – commonly referred to as simply “the edge” – allows data to be processed closer to where it originates, which can significantly reduce network latency. By physically bringing processing closer to the data source (such as IoT devices), there is less distance the data needs to travel, improving the speed and performance of devices and applications. However, the edge alone has limits for workloads like real-time analytics and machine learning, which is where fog computing comes in.
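As a rough illustration of processing at the data source itself, the sketch below is a hypothetical Python example: the sensor read, threshold, and upload function are all invented for illustration. The device evaluates its own reading locally and only sends an event upstream when something noteworthy happens.

```python
# Minimal sketch of edge-style processing: the device itself reacts to
# readings locally and only sends noteworthy events upstream.
# All names and thresholds here are illustrative, not a real API.

import random
import time

def read_sensor() -> float:
    """Stand-in for a real sensor read; returns a temperature in Celsius."""
    return 20.0 + random.uniform(-5.0, 15.0)

def send_upstream(event: dict) -> None:
    """Stand-in for a network call to a gateway or cloud endpoint."""
    print(f"uploading event: {event}")

ALERT_THRESHOLD_C = 30.0

def run_once() -> None:
    reading = read_sensor()
    # The decision is made on the device itself -- no network round trip
    # for the time-sensitive part (e.g. raising an alarm).
    if reading > ALERT_THRESHOLD_C:
        send_upstream({"type": "over_temperature", "value": reading, "ts": time.time()})

if __name__ == "__main__":
    for _ in range(5):
        run_once()
        time.sleep(0.1)
```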

What Is Fog Computing?

Fog computing – also known as fog networking or “fogging” – is a term created by Cisco in 2014 to describe a decentralized computing architecture that acts as an extension of cloud computing. Storage and compute are distributed in the most logical and efficient locations between the cloud and the data source. Fog computing is seen as a complementary strategy for implementing edge computing effectively while providing the compute, network, and storage capabilities of the cloud. It is estimated that revenue from the fog computing market will increase by 55% between 2019 and 2026.

Fog Computing vs. Edge Computing

Edge and fog computing are often used interchangeably. Edge computing is the more widely used term and often encompasses the concepts behind fog computing as part of one cohesive strategy. Broken down, though, fog computing was created to accompany edge strategies and to serve as an additional architectural layer, providing enhanced processing capabilities that the edge alone cannot always deliver.

Fog cannot exist without edge computing, while the edge can exist without fog.

There are many similarities between fog computing and edge computing – both bring processing closer to the data source. The main difference between the two is where that processing takes place.

With fog computing, intelligence is pushed down to the local area network (LAN) level of network architecture. Data is processed in an IoT gateway or fog node.

With edge computing, intelligence is pushed directly into devices like programmable automation controllers (PACs).

Therefore, more advanced levels of processing, analytics, and machine learning are possible in the fog than at the edge alone.
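To make the split concrete, here is a hypothetical Python sketch of the two layers: each device applies only a simple sanity check before forwarding (the edge), while a LAN-level fog node pools readings from many devices and runs a heavier statistical check across them. All device names, fields, and thresholds are invented for illustration.

```python
# Illustrative sketch: edge devices do a quick local check, while a
# LAN-level fog node aggregates many devices and runs heavier analysis
# (here, a simple z-score anomaly check). Names are hypothetical.

import statistics
from collections import defaultdict, deque

class FogNode:
    def __init__(self, window: int = 50):
        # Keep a short rolling window of readings per device.
        self.readings = defaultdict(lambda: deque(maxlen=window))

    def ingest(self, device_id: str, value: float) -> None:
        self.readings[device_id].append(value)

    def anomalies(self, z_cutoff: float = 3.0) -> list[str]:
        """Flag devices whose latest reading deviates strongly from their history."""
        flagged = []
        for device_id, window in self.readings.items():
            if len(window) < 10:
                continue  # not enough history yet
            mean = statistics.fmean(window)
            stdev = statistics.pstdev(window) or 1e-9
            if abs(window[-1] - mean) / stdev > z_cutoff:
                flagged.append(device_id)
        return flagged

# Edge side: each device only applies a hard limit before forwarding.
def edge_filter(value: float, hard_limit: float = 100.0) -> bool:
    return value <= hard_limit  # drop obviously bogus readings on-device

if __name__ == "__main__":
    node = FogNode()
    samples = {"sensor-1": [20.1, 20.3, 19.9, 20.0] * 5 + [35.0],
               "sensor-2": [50.0, 50.2, 49.8, 50.1] * 5 + [50.0]}
    for device_id, values in samples.items():
        for v in values:
            if edge_filter(v):
                node.ingest(device_id, v)
    print("anomalous devices:", node.anomalies())
```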

Fog Computing vs. Cloud Computing

While fog computing offers many of the same advantages as the cloud, the cloud has limitations: it is centralized and located further from the data source, which increases latency and limits bandwidth. It’s not always practical to transmit huge amounts of data all the way to the cloud and back again, especially in scenarios where cloud-scale processing and storage is not necessary.

Cloud                Fog
Centralized          Decentralized
Higher latency       Low latency
Limited mobility     Supported mobility

Fog computing is well suited to more advanced yet short-term analytics at the edge, freeing up cloud resources for tasks involving more massive data sets.
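As a small, hypothetical sketch of that pattern, the fog layer below keeps only a short rolling window of raw readings and produces a compact summary that could be sent to the cloud, so the cloud never has to ingest the full raw stream. Names and values are illustrative only.

```python
# Sketch of short-term analytics at the fog layer: raw data lives only
# in a brief rolling window, and only a compact summary would travel
# upstream to the cloud. All names here are hypothetical.

from collections import deque

class ShortTermAnalytics:
    def __init__(self, window_size: int = 60):
        self.window = deque(maxlen=window_size)  # raw readings live here, briefly

    def add(self, value: float) -> None:
        self.window.append(value)

    def summary(self) -> dict:
        """Compact aggregate that is cheap to send upstream to the cloud."""
        if not self.window:
            return {"count": 0}
        return {
            "count": len(self.window),
            "min": min(self.window),
            "max": max(self.window),
            "mean": sum(self.window) / len(self.window),
        }

if __name__ == "__main__":
    analytics = ShortTermAnalytics(window_size=5)
    for v in (10.0, 12.5, 11.0, 13.2, 9.8, 14.1):  # oldest reading drops out
        analytics.add(v)
    # Only this summary, not the raw stream, would be uploaded to the cloud.
    print(analytics.summary())
```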

It’s important to note that fog and edge computing are not meant to replace centralized cloud computing but rather to coexist with it in a cohesive IT strategy.

Why is it Called Fog Computing?

Fog, as a weather term, is a cloud that sits close to the earth’s surface. Fog computing conjures up the same imagery: it sits close to the edge of the network and its endpoints, providing cloud-like capabilities without being the cloud.

How Does Fog Computing Work?

Fog nodes extend the cloud down to a distributed layer close to the edge. Data is transmitted from endpoints to a nearby gateway or fog node, processed there, and the results are sent back to the originating devices, with selected data forwarded on to the cloud for longer-term storage and analysis.
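The toy Python sketch below traces that flow with invented names: an endpoint pushes a reading one hop to a fog gateway on the LAN, the gateway processes it and returns a decision, and anything needing long-term storage or heavier analysis could be forwarded on to the cloud separately.

```python
# Illustrative round trip: endpoint -> fog gateway on the LAN -> decision
# back to the endpoint. Class names, readings, and thresholds are hypothetical.

class FogGateway:
    def process(self, device_id: str, reading: float) -> dict:
        # Processing happens one network hop from the device,
        # instead of in a distant cloud region.
        action = "open_vent" if reading > 28.0 else "no_action"
        return {"device": device_id, "action": action}

class Endpoint:
    def __init__(self, device_id: str, gateway: FogGateway):
        self.device_id = device_id
        self.gateway = gateway

    def report(self, reading: float) -> None:
        # Send the reading to the gateway and act on the reply.
        result = self.gateway.process(self.device_id, reading)
        print(f"{self.device_id}: reading={reading} -> {result['action']}")

if __name__ == "__main__":
    gateway = FogGateway()
    sensor = Endpoint("thermostat-3", gateway)
    for reading in (22.4, 29.1, 26.7):
        sensor.report(reading)
```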

Colocation Complements the Edge and Fog Computing

Digital transformation means something different to every business, and meeting these new challenges is forcing businesses to rethink their architectural paradigms. For example, a highly centralized architecture often proves problematic: there is less control over how you connect to your network service providers and end users, which ultimately creates inefficiencies in your IT strategy. At the same time, relying solely on small, “near edge” data centers can become expensive, constrain capacity and processing workloads, and potentially limit bandwidth.

A core multi-tenant data center provides a component of a better distributed architecture. It can be helpful to think of your IT infrastructure in terms of layers. The first layer consists of your enterprise core where things like your intellectual property, high-density computing, and machine learning can live. From there, you can continue to add layers such as cloud computing services, distributed multi-site colocation, and 5G aggregation as part of your edge delivery platform.

Through a multi-tier distributed architecture, you gain control over adding capacity, network, compute, and storage, and over shortening the distances between your workloads and end users – ultimately enhancing performance and improving data exchange.

Fog computing does not replace the cloud but rather complements it.