
Cloud computing vs. edge computing


Cloud computing isn’t exactly a new concept. Its benefits are well known in the business world, and without it we wouldn’t have many of the services that companies of all sizes rely on today. It’s no surprise, then, that 85 percent of companies believe cloud adoption is necessary for innovation. However, it is during the current Covid-19 crisis that cloud has really come into its own, enabling millions of companies around the world to continue to operate whilst almost all of their workforce logs on from home.

About the author

Nick Offin, Head of Sales, Marketing and Operations, dynabook Northern Europe.

Newer concepts like edge computing are regularly discussed alongside the cloud, often as if they are mutually exclusive approaches to infrastructure. However, using one does not preclude using the other. Some people also believe that edge computing will eventually replace traditional cloud computing, but this isn’t the case. Both technologies have important and distinct roles within an IT ecosystem.

That said, there are use cases where edge computing has advantages over traditional centralized cloud infrastructure, especially during this unprecedented increase in remote working: overcoming latency issues, easing operational strain and improving security. So, what do these advantages look like?

Advantage one: reducing operational strain

When comparing traditional cloud computing and edge computing, the main difference is how and where data processing takes place. With cloud, data is stored and processed in a central location (usually a data center), whereas edge computing refers to data processing nearer the source.

We already live in a data-rich world, with the proliferation of new technologies such as IoT, 5G, wearables and assisted reality (AR) creating vast amounts of data that are generated close to the user, at the edge of the network. Remote working only adds to this as more and more devices access company networks from outside central locations like offices. The cloud itself has significant compute and online storage capabilities; however, such strain on network bandwidth calls for a different type of infrastructure – this is where edge computing comes in.

Completely overhauling IT infrastructure to cope with this demand can be expensive and resource-heavy for businesses. With edge computing, you don’t need to “rip and replace” infrastructure: processing data at the edge reduces the strain on the cloud. In conjunction with edge data centers, edge computing can take on localized data processing, freeing up the cloud for more general-purpose business needs and helping business applications perform faster.
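To illustrate the idea, here is a minimal sketch (not from the article, with hypothetical names and figures) of how an edge node might aggregate raw sensor readings locally and forward only a compact summary to the cloud, cutting both bandwidth and central processing load:

```python
def summarise_at_edge(readings):
    """Reduce a batch of raw readings to a small summary payload
    so only a handful of numbers, not every sample, crosses the network."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# e.g. temperature samples collected on a device at the edge
raw = [21.4, 21.9, 22.1, 21.7, 22.3, 21.8]
payload = summarise_at_edge(raw)  # four summary values instead of every sample
```

The cloud still receives what it needs for large-scale analysis, but the per-sample processing happens near the source.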

Advantage two: latency

By the nature of the cloud, information is relayed back to the data center, processed, and then sent back to the edge of the network where the device sits. The time data spends travelling back and forth can cause lag, or latency. In many use cases where processing is not time-critical, the cloud’s abundant processing power, storage and large-scale data analysis are exactly what’s needed. In other cases, however, that latency can cause challenges for remote workers. For example, during the Covid-19 crisis employees have depended heavily on video conferencing, which relies on real-time connectivity.

The office worker example is perhaps not mission-critical, but network-related latency issues can have a more detrimental effect on a different kind of remote worker: those on the frontline or in the field. For example, imagine a staff member in a warehouse using the “pick-by-vision” setting on the AR smart glasses they are wearing to assist with manual order picking, sorting, inventory management, goods receipts and removal processes.

If latency occurs during this process and the worker receives delayed information, it can significantly impede their productivity and even cause ongoing fulfillment errors, which will inevitably affect a company’s bottom line. Edge computing offers a solution by relocating data processing closer to the device at the edge of the network, sharply reducing latency and therefore the incidence of network-lag-related failures.
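A back-of-envelope sketch makes the distance effect concrete. The figures below are illustrative assumptions, not measurements from the article: a fixed per-kilometre propagation cost plus a flat processing time, compared for a distant central data center and a nearby edge node:

```python
def round_trip_ms(distance_km, per_km_ms=0.01, processing_ms=5):
    """Rough one-hop round trip: propagation out and back, plus processing.
    All figures are illustrative assumptions."""
    return 2 * distance_km * per_km_ms + processing_ms

cloud_rtt = round_trip_ms(3000)  # distant central data center
edge_rtt = round_trip_ms(10)     # nearby edge node
```

Even with identical processing time, shortening the distance the data travels dominates the round trip, which is why moving processing to the edge helps real-time workloads like pick-by-vision.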

Advantage three: security and privacy

As more people work away from the office, more data is being accessed remotely. Increased incidences of remote access give cyber criminals a greater opportunity to reach company data and misuse the information within. With edge computing, data is filtered and processed locally rather than being sent straight to a central data center, before being forwarded to the organisation’s network core via the cloud. Less transfer of sensitive data between devices and the cloud means better security for businesses and their customers.
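One way this local filtering can look in practice is sketched below. This is a hypothetical example (field names and records are invented for illustration): sensitive fields are stripped at the edge so they never cross the network to the cloud at all:

```python
# Fields considered sensitive in this hypothetical deployment
SENSITIVE_FIELDS = {"employee_id", "location"}

def filter_at_edge(record):
    """Drop sensitive fields locally before anything leaves the edge device."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

record = {
    "device": "glasses-07",
    "employee_id": "E123",
    "location": "warehouse-A",
    "battery": 0.84,
}
outbound = filter_at_edge(record)  # only non-sensitive fields are sent onward
```

The principle is the same whatever the stack: the less sensitive data in transit, the smaller the attack surface.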

Covid-19 has no doubt altered the working landscape, which has meant business leaders have had to rethink their remote working strategies. During this period, the cloud has allowed data to be shared across organisations securely. However, as discussed, there are instances where edge computing can help to ease bandwidth, increase network speeds and combat security concerns.

Choosing edge or cloud computing isn’t an either/or proposition: both technologies serve different purposes and will continue to play important roles for the foreseeable future. As remote working becomes the new norm for businesses, it is likely that future network infrastructure will combine the two.