[Term of the Day]: Edge Computing


Definition: What is Edge Computing?


Edge computing is an emerging technology that is receiving considerable attention in the IT sector. Often described as the next evolution of cloud computing, it will not make the cloud obsolete; rather, it transforms cloud computing and builds on its capabilities. Edge computing is a distributed computing approach that brings computation and data storage close to the source of the data.

 

Traditional cloud computing networks are highly centralized and have been widely used across the world for more than a decade. However, this architecture has two problems. First, it takes time for data to travel from an edge device to the center for processing; the delay may be only a matter of milliseconds, but it can be critical. Second, all the data traveling back and forth between the edge and the center of the network puts tremendous strain on bandwidth. The combination of distance and high-volume traffic can slow a network to a crawl.

Edge computing addresses these challenges of latency and inefficient data transfer. The core idea is to manage data at the point where it is generated, rather than uploading it to a centralized resource where data has traditionally been processed. As the Internet of Things grows, edge computing becomes increasingly critical for efficiently gathering, processing, and routing data. According to Gartner, 91% of today's data is created and processed in centralized data centers; by 2022, about 75% of all data will need analysis and action at the edge.
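To make the bandwidth argument concrete, here is a minimal, hypothetical sketch of one common edge pattern: aggregating raw sensor readings locally and forwarding only compact summaries to the central cloud, instead of streaming every sample. The function name and summary fields are illustrative, not from any particular edge platform.

```python
from statistics import mean

def edge_aggregate(readings, window=10):
    """Summarize raw sensor readings at the edge.

    Instead of forwarding every sample upstream, emit one small
    summary per window of samples, cutting the number of messages
    that cross the network by roughly a factor of `window`.
    """
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({
            "count": len(chunk),
            "min": min(chunk),
            "max": max(chunk),
            "mean": round(mean(chunk), 2),
        })
    return summaries

# 100 raw temperature samples become 10 summaries bound for the cloud.
raw = [20.0 + (i % 7) * 0.5 for i in range(100)]
print(len(edge_aggregate(raw)))  # far fewer messages than raw samples
```

In a real deployment the summaries would be published to a cloud endpoint (for example over MQTT or HTTPS), while the raw data is discarded or retained locally, which is exactly the latency and bandwidth trade-off described above.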
