
What does the rise of edge computing mean for cybersecurity?

Edge computing brings computing power and data storage back from the cloud to local devices, but what does it mean for cybersecurity?


Everyone in enterprise IT is talking about transitioning to the cloud. But now there’s a new paradigm on the block – edge computing. And it’s causing quite a stir in a world already facing a multitude of cyber-threats. So, what exactly is edge computing, and what does it mean for information security?

First, a short history of business computing. Until the 1980s, businesses had barebones computers, also known as dumb terminals, connected to a mainframe that took care of computing workloads and data storage. Then along came personal computers, where the machine on your desk handled the computing workloads. Today, in-house servers and workstations are falling out of favor in business environments as cloud computing takes over: many businesses have migrated to the cloud, letting users access centralized computing services hosted by companies like Amazon and Google. In many ways, it’s a lot like the mainframes of the old days, albeit on a far bigger scale, since the internet renders geographical boundaries mostly irrelevant.

Today, ‘dumb’ terminals have made a comeback in the form of smartphones, tablets and other internet-connected devices, where many computing workloads are handled in the cloud rather than on the device itself. At the same time, though, these devices are orders of magnitude more powerful than the workstations of just a couple of decades ago, and perfectly capable of doing their own work.

And now we have a new option. Edge computing brings those computing workloads and data storage back to local devices, much like when we first started using personal computers. But there are two fundamental differences: internet-enabled devices are more numerous and diverse than ever before, and most of them are permanently connected to the internet.

Edge computing represents a fusion of cloud and local computing: the cloud is retained for carrying and, in many cases, storing data, while local internet-connected devices take care of the processing. By 2022, some three-quarters of enterprise data is expected to be processed outside the cloud, so it’s safe to say this is more than a passing trend.

Why are companies moving to edge computing?

The most immediate advantage of edge computing is speed. With cloud computing, data often travels hundreds, or even thousands, of miles between the local device and a remote data center, so the effective responsiveness of a cloud-based app depends heavily on how far its instructions have to travel. The delay is known as latency, measured in milliseconds (ms), and a latency of 20-40ms is about the best you can expect from today’s cloud-hosted applications. The more significant issue, though, is the extra delay caused by interference or outdated protocols along the route when processing happens remotely. Response times will always be faster with local computing, not just because the laws of physics say so, but also because there’s less scope for interference and rerouting issues.
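If you want to see the gap for yourself, here’s a minimal sketch in Python that times a cloud round trip against a purely local computation. The endpoint URL and the stand-in workload are placeholders for illustration, not a benchmark of any particular service.

```python
import time
import urllib.request

# Placeholder standing in for a cloud-hosted service endpoint.
CLOUD_ENDPOINT = "https://example.com/"

def cloud_round_trip_ms() -> float:
    """Time one request/response cycle to the remote endpoint."""
    start = time.perf_counter()
    with urllib.request.urlopen(CLOUD_ENDPOINT, timeout=5) as response:
        response.read()
    return (time.perf_counter() - start) * 1000

def local_processing_ms() -> float:
    """Time a comparable computation done on the device itself."""
    start = time.perf_counter()
    sum(i * i for i in range(100_000))  # stand-in local workload
    return (time.perf_counter() - start) * 1000

print(f"Cloud round trip: {cloud_round_trip_ms():.1f} ms")
print(f"Local processing: {local_processing_ms():.1f} ms")
```

On a typical connection, the network round trip dominates, which is exactly the latency gap described above.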

Although the performance difference is barely discernible in most business applications, other computing workloads aren’t ideally suited to the cloud and likely never will be, because of the inherent performance and latency constraints. These include bandwidth-intensive applications like real-time 3D rendering and the synchronization of massive amounts of data with online storage. Even with a 1Gbps internet connection, you can run into bandwidth problems. And that’s before we take other factors like bandwidth hogs and interference into consideration. Let’s not even get started on the painfully stingy data limits of many mobile providers.

Internet speeds might be increasing all the time, but so are computing demands. Many cloud software providers are working hard to reduce bandwidth consumption and problems with high latency. Having to be online all the time is another source of frustration, particularly when computing on the move, where connections might be intermittent. Fortunately, things are changing. For example, progressive web apps (PWAs) often provide a basic level of offline functionality, and Google’s Chrome OS, one of the most cloud-centric platforms of all time, now lets Chromebook users work offline.

Self-driving cars – the best example of computing on the edge

Perhaps the ultimate example of edge computing is the self-driving car. Given the importance of quick reaction times in a moving vehicle, there’s no scope for delays caused by latency or intermittent service outages. The numerous sensors that make up the system have to feed data, in real time, into an onboard computer for local processing. At the same time, responsibility for keeping people safe shifts towards the company that makes the software, which makes centralized management a practical necessity. That’s why self-driving cars still need to be hooked up to the internet: to receive critical updates and to feed data back into the cloud so developers can keep improving the algorithms.

Does edge computing mean regaining or losing control over enterprise data?

Having data processed locally while retaining the cloud as a way to transmit and store it potentially gives businesses more control over their data. While remotely hosted and managed systems may spare us worries like updates and maintenance, letting third parties decide which features our devices need puts privacy and security in a precarious position. The stakes are even higher for businesses, which have to meet both compliance demands and customers’ expectations that their data will be kept safe. And while legislation like the EU’s GDPR aims to return control to the user, cybercriminals won’t pay much attention to regulatory compliance.

Losing control is one of the most common fears business leaders have about migrating to the cloud in the first place. The combination of cloud and edge computing introduces a fresh concern: that businesses will end up surrendering control of their connected devices to third parties and potentially putting customer data at risk. Hackers who gain access to devices via the cloud may also be able to steal the data stored on them. This is already manifesting itself in the consumer world, with platforms like Apple’s Siri reportedly recording people’s conversations.

With IoT (internet of things) leading to a massive uptake of smart internet-connected devices, ranging from in-store beacons to remote-controlled HVAC (heating, ventilation and air conditioning) systems, the risk continues to grow.

Whether your data is stored locally or in the cloud, the risk of a breach is the same if hackers get hold of your credentials. But edge computing also expands the potential attack surface by having sensitive data stored and processed across a far more extensive array of systems. Protecting ubiquitous computing environments at scale becomes much harder, even practically impossible, simply because the footprint grows so large. A recent TechRepublic study found that two-thirds of IT teams consider edge computing more of a threat than an opportunity, mainly because of this dramatically increased endpoint attack surface.

If someone has physical access to a device, that device is no longer secure. Losing control of devices empowered with edge computing technology can expose vastly more customer data and intellectual property than losing control of other types of devices.

The Top Five Emerging Technologies Security Leaders Need To Prepare For, Forrester, 2018

The stakes are high with edge devices, and even higher with consumer-grade technology. To start with, businesses must be mindful about which suppliers they choose: consumer-grade tech is often a no-go, which is a challenge for smaller firms, one in four of which rely on it for their data security. They also need to stop placing total trust in basic endpoint security measures like passwords and perimeter defenses. Business leaders need to extend security capabilities to all edge devices. That includes encrypting data both at rest and in transit, changing default passwords and maintaining control through a centralized management dashboard that governs how devices interact with the computing environment.
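As a rough illustration of the first of those measures, here’s a short sketch of encrypting data at rest on an edge device, using Python’s widely adopted cryptography library. The device ID and payload are invented for the example.

```python
from cryptography.fernet import Fernet

# In practice the key would live in a secrets manager or hardware module,
# never alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical sensor reading, encrypted before it touches local storage.
reading = b'{"device_id": "hvac-07", "temp_c": 21.5}'
token = cipher.encrypt(reading)

# Only a holder of the key can recover the plaintext.
assert cipher.decrypt(token) == reading
```

The same principle applies in transit: data should travel over encrypted channels such as TLS, so a stolen device or an intercepted connection yields only ciphertext.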

Do you need the cloud for everything?

The terms edge computing and the internet of things are often used interchangeably, but they’re not the same thing. An edge computing device doesn’t have to be connected to the internet at all times, or even at all. Instead, it might be connected only to an internal network, as in many industrial and other business environments. By definition, these devices are meant to retain some or even all of their functionality offline. That means companies can reduce risk by disallowing direct connections between edge devices and the cloud unless they’re necessary for critical functions, as sketched below. Any data that does leave the device should then be protected by multiple layers of security, such as encryption and multi-factor authentication.
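One way to picture that policy is a device that processes raw readings locally and permits outbound connections only to an explicit allow-list. This is a hypothetical sketch; the endpoint name and readings are invented.

```python
import statistics

# Hypothetical readings, processed entirely on the device.
readings = [21.4, 21.6, 21.5, 22.0]

def summarize(values: list[float]) -> dict:
    """Reduce raw data locally; only this aggregate ever leaves the device."""
    return {"mean": statistics.mean(values), "max": max(values)}

# Explicit allow-list: the only cloud destination this device may contact.
ALLOWED_ENDPOINTS = {"telemetry.example.com"}

def upload(endpoint: str, payload: dict) -> None:
    """Refuse any egress that isn't on the allow-list."""
    if endpoint not in ALLOWED_ENDPOINTS:
        raise PermissionError(f"Blocked egress to {endpoint}")
    print(f"Would send {payload} to {endpoint} over TLS")  # transmission omitted

upload("telemetry.example.com", summarize(readings))
```

Everything else stays on the internal network, shrinking the attack surface described in the previous section.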

Done right, then, edge computing can be beneficial, rather than detrimental, to security.

Article reflects the opinions of the author. Published in 2019.


About authors

Produced by the editorial team for Secure Futures by Kaspersky magazine