September 25, 2020

Edge computing and 5G give business apps a boost

Over the past decade, advances in cloud computing have driven a centralized approach to system administration and operations, while the growth of mobile computing, SaaS, and the internet of things (IoT) has driven computing toward a distributed architecture. With the rollout of 5G and edge computing technologies, companies are now looking to take advantage of both approaches while boosting performance for their applications.

While much of the hype around 5G and edge tends to focus on innovative, cutting-edge applications in areas such as robotics, augmented or virtual reality (AR/VR), and autonomous vehicles, experts say the benefits of edge computing go beyond these apps to provide IT professionals an array of opportunities.

How edge computing tackles latency

Enterprises have benefited from cloud computing during the past decade by centralizing resources at data centers owned by cloud providers — saving money on management costs and avoiding capital expenditures needed for internal data centers. But centralization has led to performance issues when dealing with endpoints on the internet’s “edge,” such as IoT devices/sensors and mobile devices.

While today’s smartphones are essentially intelligent computers that fit in your pocket, they still require a vast amount of processing done in the cloud. “Why can’t you put all the intelligence at the end? In other words, why can’t your smartphone just do it?” asked Mahadev Satyanarayanan, a professor of computer science at Carnegie Mellon University.

“The answer is, to do the kind of compute that you want done, you need far more computing resources than you would carry with you on your smartphone,” he said. “If you think about the video camera on your smartphone, it’s extremely light. But if you wanted to do real-time video analytics on it, you couldn’t do it with the computer on the phone today — you would ship [the data] to the cloud, and that’s where the problem begins.”

The solution, as outlined in an influential 2009 IEEE Pervasive Computing article co-authored by Satyanarayanan, is to use virtual machine-based “cloudlets” in mobile computing — in other words, placing mini data centers at the network’s edge close to where their processing power is needed.

On average, Satyanarayanan explained, the round-trip time between a smartphone and a cell tower is about 12 to 15 milliseconds over a 4G LTE network, and can be longer depending on legacy systems and other factors. However, when you ping a cloud data center from your smartphone, the round trip can take anywhere from 100 to 500 milliseconds, even up to a full second in some cases. Satyanarayanan calls this lag the “tail of distribution,” which is problematic for low-latency applications.
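To see why those round-trip times matter, consider a back-of-the-envelope check in Python using the ballpark figures Satyanarayanan cites. The frame rate and server-side processing time below are illustrative assumptions, not numbers from the article:

```python
# Rough latency-budget check for a real-time video analytics app.
# RTT figures are the ballpark numbers cited above: ~12-15 ms to a
# nearby cell tower (edge) vs. 100-500 ms to a distant cloud data center.

FRAME_RATE_FPS = 30                        # assumed camera frame rate
FRAME_BUDGET_MS = 1000 / FRAME_RATE_FPS    # ~33 ms to handle each frame

def fits_budget(rtt_ms: float, processing_ms: float = 10.0) -> bool:
    """True if network RTT plus assumed server-side processing time
    still fits within one frame interval."""
    return rtt_ms + processing_ms <= FRAME_BUDGET_MS

print(fits_budget(15))    # edge/cloudlet RTT -> True
print(fits_budget(100))   # best-case cloud RTT -> False
```

Even the best-case cloud round trip blows through the per-frame budget before any processing happens, while an edge round trip leaves room to spare — which is the core of the cloudlet argument.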

“Human users in applications like augmented reality are extremely sensitive to the tail,” Satyanarayanan said. “If I give you half an hour of an augmented reality experience, you may have 25 minutes of a superb experience. But what you will remember is five minutes of a horrible experience.”

Copyright © 2020 IDG Communications, Inc.
