
Covid-19 Impact: Why edge computing is gaining popularity?


The pandemic has compelled enterprises to rapidly move their critical workloads to the cloud to ensure the seamless functioning of their businesses. As the cloud gains momentum and enterprises look for ways to optimise their network, storage, and agility, edge computing has emerged as a compelling solution.

To understand where edge computing fits in the whole spectrum of IT infrastructure, we need to begin with the basics: what exactly is edge computing? Edge computing is a type of distributed architecture in which data processing occurs close to the source of the data, that is, at the "edge" of the system. This approach reduces the need to bounce data back and forth between the cloud and the device while maintaining consistent performance. It lowers latency in data transmission and computation, thereby enhancing agility.

In terms of infrastructure, edge computing is a network of local micro data centres that handle storage and processing, while the central data centre oversees operations and draws valuable insights from the local data processing. It is worth keeping in mind that edge computing is an extension of cloud computing architecture: an optimised solution for decentralised infrastructure.

The ultimate purpose of edge computing is to bring compute, storage, and network services closer to endpoints and end users to improve overall application performance. Based on this knowledge, IT architects must identify and document instances where edge computing can address existing network performance problems.

How does edge computing work?
In traditional enterprise computing, data is produced at a user's computer. That data moves across a WAN, such as the internet, and through the corporate LAN to a central data centre, where it is stored and processed by an enterprise application. The results of that work are then conveyed back to the end user. However, given the number of devices now connected to a company's servers, and the volume of data they generate, this is far too much for a traditional IT infrastructure to accommodate.
So, IT architects have shifted focus from the central data centre to the logical edge of the infrastructure, moving storage and computing resources out of the data centre to the point where the data is generated.
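
To make this shift concrete, here is a minimal sketch of an edge node that aggregates raw sensor readings locally and forwards only a compact summary across the WAN. The endpoint, window, and readings are hypothetical, not any specific product's API:

```python
import json
import statistics
from urllib.request import Request, urlopen

# Hypothetical ingest endpoint exposed by the central data centre.
CENTRAL_ENDPOINT = "https://central.example.com/ingest"

def summarise(readings):
    """Reduce a window of raw readings to a compact summary at the edge."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

def forward_to_centre(summary):
    """Send only the summary, not the raw stream, across the WAN."""
    payload = json.dumps(summary).encode("utf-8")
    request = Request(CENTRAL_ENDPOINT, data=payload,
                      headers={"Content-Type": "application/json"})
    urlopen(request)  # a real deployment would add retries and authentication

# One window of raw readings produced at the edge, e.g. one per second.
window = [21.4, 21.6, 21.5, 35.2, 21.4]
forward_to_centre(summarise(window))  # one small payload instead of five raw ones
```

Only the summary crosses the WAN; the raw stream stays at the edge, which is exactly the latency and bandwidth saving described above.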

There are several reasons for the growing adoption of edge computing:

  • Emerging technologies such as the Internet of Things (IoT) and the Internet of Behaviours (IoB) generate data in real time.
  • Devices enabled by these technologies require fast response times and considerable bandwidth to operate properly.
  • Cloud computing is centralised. Transmitting and processing massive quantities of raw data puts a significant load on the network’s bandwidth.
  • Constantly moving large quantities of data back and forth is not cost-effective and introduces latency.
  • Processing data at the source and then sending only the valuable data to the centre is a more efficient solution (see the sketch after this list).
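
As a rough illustration of that last point, an edge node can keep routine readings local and escalate only the ones that matter. The threshold and values below are hypothetical:

```python
ALERT_THRESHOLD = 30.0  # hypothetical cut-off for a "valuable" reading

def is_valuable(reading):
    """Decide at the edge whether a reading is worth sending to the centre."""
    return reading > ALERT_THRESHOLD

raw_stream = [21.4, 21.6, 35.2, 21.5, 33.9, 21.4]
to_centre = [r for r in raw_stream if is_valuable(r)]

# Only 2 of the 6 readings cross the WAN; the rest stay at the edge.
print(to_centre)  # [35.2, 33.9]
```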

As organisations move to remote working models, we will witness wider adoption of edge computing, as it empowers remote work infrastructure with greater computation and storage capabilities.

Written by Neelesh Kripalani, Chief Technology Officer at Clover Infotech, and published in Financial Express.
