WTFog Computing

It is safe to say that Cloud computing is popular. It enables our smartphones to be really smart. In fact, every object around us is getting smarter and smarter. It won't take long before our smart homes become part of a smart city, where we'll be driving smart cars on smart roads.

Smart devices are constantly transmitting and receiving data to provide us with the services we want. And while hosting computing power on a remote server is really handy, the evolution of the 'Internet of Things' into the 'Internet of Everything' brings new problems and demands a new set of rules.

The problem with this cloud setup lies with bandwidth and latency. And as the army of smart devices continues to expand, with more and more objects connecting wirelessly to transmit and receive data, the problem is only going to keep growing.

The more data we create and process, the more important it becomes to look at the nature of that data. For example, if your smart boiler gathers data to predict when it needs maintenance, there is no need to send all of that data to the cloud.

Another example: a Tesla has more than 50 built-in sensors, which gather roughly 1 terabyte of data per year. Most of this data is handled locally, in the car, simply because sending it all to the cloud would be inefficient and of little use. The split second a smart car has to make a decision also disqualifies the cloud as the main solution here. This of course calls for local processing power, but most modern cars have more powerful processors on board than an average PC from 10 years ago.
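
To make this a bit more concrete, here is a minimal, illustrative Python sketch of what such local pre-processing could look like: raw readings are summarized on the device, and only the compact summary leaves it. The send_to_cloud() helper and the endpoint URL are placeholders invented for this example, not part of any real product.

    # Illustrative sketch: summarize raw sensor readings locally and only
    # upload a compact summary to the cloud. The endpoint URL and the
    # send_to_cloud() helper are hypothetical placeholders.
    import statistics
    import time

    CLOUD_ENDPOINT = "https://cloud.example.com/telemetry"  # placeholder URL

    def send_to_cloud(payload):
        # Placeholder: a real device would use HTTPS, MQTT or similar here.
        print(f"uploading to {CLOUD_ENDPOINT}: {payload}")

    def summarize_locally(readings):
        # Reduce many raw samples to a handful of numbers.
        return {
            "count": len(readings),
            "mean": statistics.mean(readings),
            "min": min(readings),
            "max": max(readings),
            "timestamp": time.time(),
        }

    # The raw samples stay on the device; only the summary is transmitted.
    raw_readings = [72.1, 72.3, 71.9, 95.4, 72.0]  # e.g. boiler temperatures
    send_to_cloud(summarize_locally(raw_readings))

The cloud still gets the data it needs for long-term analysis, just not every raw sample.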


In addition, having all smart devices connect to the cloud and send raw data over the internet can have privacy, security and legal implications, especially when dealing with sensitive data that is subject to different regulations across countries.

 

In response to all this, enter the new buzzword of the year: Fog computing.

Fog computing solves the above-mentioned problems by keeping data closer to the ground, in local computers and devices, rather than sending everything to the cloud. It allows data to be:

  • processed and accessed more rapidly, 
  • accessed more efficiently, and 
  • processed and accessed more reliably 

All from the most logical location, which in turn reduces the risk of latency.
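
As a rough sketch of what "the most logical location" can mean in practice, the Python snippet below shows a tiny dispatcher on a fog node that keeps time-critical data local and forwards only non-urgent data to the cloud. The message categories and handler functions are assumptions made purely for illustration.

    # Illustrative sketch: route latency-sensitive messages to local
    # handling and everything else to the cloud. Categories and handlers
    # are invented for this example; they are not a standard fog API.

    LATENCY_SENSITIVE = {"brake_command", "patient_alarm", "machine_stop"}

    def handle_on_fog_node(message):
        print(f"fog node handles: {message['kind']}")

    def forward_to_cloud(message):
        print(f"cloud handles: {message['kind']}")

    def dispatch(message):
        # Time-critical data stays local; bulk analytics take the slower path.
        if message["kind"] in LATENCY_SENSITIVE:
            handle_on_fog_node(message)
        else:
            forward_to_cloud(message)

    dispatch({"kind": "patient_alarm", "value": 180})
    dispatch({"kind": "monthly_usage_report", "value": 42})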

Hardware manufacturers such as Cisco, Dell and Intel are already working with IoT and machine learning vendors to create IoT gateways and routers that support fogging.

 

Fog does not deprecate Cloud computing. Instead, the two technologies go hand in hand.

 

So what are some real-life examples of Fog computing?

One application of Fog computing is in hospitals, for closely monitoring patients. Here, real-time data analysis is crucial, for instance when monitoring a patient's heart rate and movements during surgery. In such life-threatening situations, cloud-based monitoring devices are not suitable, as decisions need to be made in a split second.

Similarly, in the manufacturing industry, where disruptions have to be kept to a minimum, Cloud computing alone is not a reliable solution, as decisions need to be taken in real time to prevent production loss. Here, Fog computing can complement and enhance the Cloud by providing locally available computing power.
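
As an illustration of that split, the Python sketch below reacts to a sensor reading locally, in a split second, while raw readings are only batched up for a later, non-urgent upload to the cloud. The threshold, batch size and the act_locally()/upload_batch() helpers are all assumptions made for the example.

    # Illustrative sketch: immediate local decisions, delayed cloud sync.
    from collections import deque

    VIBRATION_LIMIT = 7.0      # assumed threshold for a machine sensor
    pending_upload = deque()   # raw readings waiting for a batch upload

    def act_locally(reading):
        # Immediate, local decision: no round trip to the cloud.
        print(f"ALERT: vibration {reading} exceeds limit, slowing the machine down")

    def upload_batch(batch):
        # Placeholder for a periodic, non-time-critical cloud upload.
        print(f"syncing {len(batch)} readings to the cloud")

    def handle_reading(reading):
        pending_upload.append(reading)
        if reading > VIBRATION_LIMIT:
            act_locally(reading)         # split-second, local response
        if len(pending_upload) >= 4:     # batch size chosen arbitrarily
            upload_batch(list(pending_upload))
            pending_upload.clear()

    for value in [3.2, 4.1, 8.5, 3.9, 4.0]:
        handle_reading(value)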

Fog computing scenarios

Where local analysis and storage are more useful, the data stays close to the devices; where centralized storage and analysis are more applicable, the Cloud is put to good use.

Eventually, the line between Fog and Cloud computing will fade out, leaving us with a foggy IT landscape.

 

Craving more tech insights? Subscribe to our 2SQRS TechTalks Newsletter.

 

Note: this article was originally posted on LinkedIn.com

 
