Distributed Cloud Computing: Fog Forecast

Centralized, or cloud, computing has solved lots of problems, but it has also created a few new ones of its own. By moving out to the edge and closer to the source of the data, computing systems will be much better suited to absorb the coming data tsunami and assist in bringing about the Tactile Internet of tomorrow.

by Bernd Schöne

Ever heard of Fog Computing? If not, you are in good company, because most people haven’t the foggiest idea either. Nonetheless, it is probably one of the hottest IT trends in recent years. The basic idea is simple: move your centralized cloud structures closer to the data on the fringe of the network. This concept is called Edge Cloud and has been around for years. In June 2018 the OpenFog Reference Architecture, published by the OpenFog Consortium, was adopted as an international standard (IEEE 1934) to meet the data-intensive requirements of the Internet of Things.

“We now have a … blueprint that will supercharge the development of new applications and business models,” says Helder Antunes, chairman of the OpenFog Consortium and senior director at Cisco, the network company that provided the reference model. “We are constantly wasting time and bandwidth by transferring all the data gathered by our IoT devices first to the cloud for processing and then moving the results back into the network.” It would be much more sensible, he believes, to do most of the processing on the network’s edge where the data resides. Of course, this calls for some pretty fancy routing – but intelligent routers are what Cisco does.

It also means that the networks themselves need to get smarter, and a broad coalition of players, from Google to the Apache and Docker communities, is on board for this push towards the edge. They are joined by mobility experts who have incorporated Fog into the blueprint for the emerging 5G wireless standards. Despite the wireless community’s interest, Fog is actually rooted in wired data processing, where the new motto is: “Less centralism, more distribution of tasks”. Central computer systems are prone to overload. That was one reason companies began switching to PCs in the 1980s, connecting them to departmental servers in a client-server architecture. Sometime around the mid-2000s the whole idea was turned around, when the virtualization of servers made it possible to centralize computing power once more. That saved a ton of money but didn’t really solve the bottleneck problem – in fact, it made it worse.

Distributed Cloud Computing – Typical Use Cases for Fog

Centralization has its obvious uses, of course. You get to store all your data at a convenient single location where you have it under control and can process it anytime you want. The price you pay is the time it takes to transport the data to the central server, plus the wait until it’s your turn to crunch numbers. Afterwards, the results have to move all the way back to the data’s place of origin.

In IT, the weaknesses of centralization are becoming more apparent as we enter the age of IoT, because the gap between the amount of data produced and the processing power needed continues to widen. For instance, consider the data output of a typical wind turbine, an airplane or a surveillance system. A wind turbine can boast 1,000 sensors or more, each capable of producing up to 60,000 individual bits of data per minute. The engine of a late-model Airbus A320 jetliner generates terabytes of data every hour from more than 24,000 measuring points, each producing millions of data sets that need to be downloaded after landing and sent to a data center for further processing. Many digital surveillance cameras already provide 4K resolution, which means 10.2 gigabits per second – and 8K cameras are just around the corner.
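
To put these numbers in perspective, a quick back-of-the-envelope calculation helps. The sketch below simply multiplies out the wind-turbine figures quoted above; the per-sensor rate is the article’s upper bound, not a measured value.

```python
# Rough data-volume estimate for a single wind turbine,
# based on the upper-bound figures quoted above.

SENSORS = 1_000                   # sensors per turbine
BITS_PER_SENSOR_PER_MIN = 60_000  # up to 60,000 bits per sensor per minute

bits_per_minute = SENSORS * BITS_PER_SENSOR_PER_MIN
gigabytes_per_day = bits_per_minute / 8 * 60 * 24 / 1e9

print(f"{bits_per_minute / 1e6:.0f} Mbit per minute")  # 60 Mbit per minute
print(f"{gigabytes_per_day:.1f} GB per day")           # 10.8 GB per day
```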

In each of these cases, operational efficiency does not demand that all the data be moved to the center; what counts is the message the data conveys. For instance, an analysis performed at the edge can determine whether a wind turbine needs unscheduled maintenance or a jet engine is in danger of shutting down in mid-flight.
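
As a minimal sketch of this idea in Python: an edge node reduces a window of raw turbine vibration samples to a single maintenance verdict, so only the conclusion travels upstream. The threshold and field names here are hypothetical, chosen purely for illustration.

```python
from statistics import mean

# Hypothetical limit, for illustration only -- real condition
# monitoring uses far more elaborate models.
VIBRATION_LIMIT_MM_S = 7.1

def edge_verdict(samples_mm_s):
    """Condense a window of raw vibration samples into one message.

    The raw samples stay at the edge; only the verdict is sent on.
    """
    avg = mean(samples_mm_s)
    if avg > VIBRATION_LIMIT_MM_S:
        return {"status": "maintenance_required", "avg_mm_s": round(avg, 2)}
    return {"status": "ok", "avg_mm_s": round(avg, 2)}

# Thousands of samples in, one small dict out.
window = [6.2, 7.4, 8.9, 9.1, 8.7] * 2000
print(edge_verdict(window))  # {'status': 'maintenance_required', 'avg_mm_s': 8.06}
```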

Fog deals with these issues by doing away with gridlock and thereby with pain points for the customer. It comes in many shapes and sizes, for instance Fog Computing, Fog Networking, Edge Computing or Edge Cloud. Perhaps a better term would be Distributed Cloud Computing.

Users couldn’t care less which name the techies stick on their systems, just as long as it works. But how to make thousands of devices, processors and storage points function seamlessly as a distributed network? This will require more powerful routers with new kinds of software and, possibly, new operating systems. A key component will be the Edge Controller, a kind of programmable logic controller (PLC) – a small computer with a built-in operating system optimized to handle incoming events in real time. These tiny processing units need to sit as close as possible to the sensors, where incoming data can be filtered more rapidly.
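
One simple technique such an edge controller might run is a deadband filter: forward a sensor reading only when it differs meaningfully from the last value sent. This is a generic sketch, not taken from any particular PLC vendor’s API.

```python
def deadband_filter(readings, threshold=0.5):
    """Yield only readings that differ from the last forwarded value
    by more than `threshold` -- a classic way an edge controller cuts
    upstream traffic while staying responsive to real changes.
    """
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) > threshold:
            last_sent = value
            yield value

# Ten raw samples in; only the significant changes go upstream.
raw = [20.0, 20.1, 20.2, 23.5, 23.6, 23.4, 19.0, 19.1, 19.2, 19.1]
print(list(deadband_filter(raw)))  # [20.0, 23.5, 19.0]
```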


It’s early days, and much of the technology needed is still under development. To create a network of smart devices and local clouds, the system must be capable of finding the necessary programs and transporting the required data to them as quickly as possible. This is sure to require thousands, possibly millions, of small, modular data centers, or even tiny Micro Modular Data Centers (MMDCs). Standardization work is under way at ETSI, the regional standards body set up in 1988 by the European Conference of Postal and Telecommunications Administrations (CEPT) in response to proposals from the European Commission. Its goal is an open application framework called Multi-access Edge Computing (MEC), which will allow the deployment of services such as radio-aware video optimization, using caching, buffering and real-time transcoding to reduce congestion of the cellular network and improve the user experience. Fog and the mobile world, it seems, are moving ever closer together.

A final issue Fog developers worry about is latency, the time taken by a unit of data (typically a frame or packet) to travel from its originating device to its intended destination, usually measured in milliseconds. Delays of more than ten milliseconds in a video transmission can be annoying for viewers, and wearers of data glasses experience dizziness if latency exceeds five milliseconds. Machines are even more sensitive to signal delays, where anything above a millisecond can lead to a shutdown.

Since latency is a function of distance, Fog and Distributed Cloud Computing have a distinct advantage here. The fastest way to transmit data is through fiber optic cables. Light travels at 300,000 km per second in a vacuum, but in glass fiber it slows to about 200,000 km per second, and switches and routers along the way add further delay. It seems that the laws of physics actually demand Edge Computing: to achieve a round-trip latency below one millisecond, controllers and sensors cannot be further than 100 km apart.
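
The 100 km figure follows directly from that signal speed; here is the round-trip arithmetic as a small sketch. Switching and processing delays, ignored here, would shrink the distance further.

```python
# Round-trip latency budget vs. distance in fiber.
FIBER_SPEED_KM_S = 200_000  # light in glass, roughly 2/3 of vacuum speed

def max_distance_km(latency_budget_s):
    """Farthest a controller can sit from a sensor if the signal
    must travel there and back within the latency budget.
    """
    return FIBER_SPEED_KM_S * latency_budget_s / 2

print(max_distance_km(0.001))   # 1 ms budget   -> 100.0 km
print(max_distance_km(0.0001))  # 0.1 ms budget -> 10.0 km
```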

Experts have high hopes for even faster applications with Distributed Cloud Computing. Below a millisecond we enter the realm of the Tactile Internet, which offers users the same immediacy as their sense of touch. This will be crucial in areas such as telemedicine or in piloting drones. Combined with haptic gloves to simulate physical sensations, these systems will open up new possibilities, for instance, allowing doctors to perform tricky operations on a patient who is halfway across the world, or enabling bomb squads to defuse explosive devices from a safe distance. The future, it seems, looks bright for Fog.


Protecting Data through Fog

The newly enacted European General Data Protection Regulation (GDPR) is expected to become a major driver for fog, or distributed cloud computing. Authorities in Europe are tasked with overseeing the use of personal information at the local level, and non-compliance can lead to heavy fines. Edge computing could provide a robust way of making sure sensitive data is stored and processed within the territorial limits of the European Union, as stipulated by the new regulation.
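
To illustrate the idea, here is a hypothetical data-locality check of the kind an edge platform could enforce. The region names and function are invented for this sketch; this is not a real GDPR compliance API.

```python
# Hypothetical data-locality policy -- regions and names are
# illustrative only.
EU_EDGE_REGIONS = {"eu-west", "eu-central", "eu-north"}

def choose_storage_region(record_is_personal, local_edge_region):
    """Pin personal data to the local EU edge node; anything else
    may fall back to a central cloud region."""
    if record_is_personal:
        if local_edge_region not in EU_EDGE_REGIONS:
            raise ValueError("no compliant EU edge region available")
        return local_edge_region
    return "central-cloud"

print(choose_storage_region(True, "eu-central"))   # eu-central
print(choose_storage_region(False, "eu-central"))  # central-cloud
```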
