Edge Computing - Explained
3rd Jan 2020
Everyone’s talking about the Edge, but what is it?
The Edge is the buzzword of the moment for some of the biggest tech companies in the world, billed as the next big thing since the cloud. It can seem overwhelming to get to grips with, but broken down it’s a much simpler concept than you might think.
Edge computing will let IoT devices work to their full potential. Alongside technologies such as 5G and Wi-Fi 6, it forms the infrastructure needed for the advanced devices now starting to emerge on the market.
Personal Computing to Cloud Computing
In the beginning, there was personal computing. Everything was processed on your personal machine, and machines were hardwired together in order to communicate. Then, not so long ago, the buzzword in IT was ‘cloud’. Everyone was obsessed with cloud computing and the enterprise potential it opened up: the idea of a centralised repository for processing and data promised huge gains.
IT businesses rushed to move everything to the cloud, and the huge centralised data stores it enabled allowed some of the biggest digital companies to come to fruition. Facebook, Twitter, Google, Spotify… they wouldn’t function without the cloud. They rely on masses of data that can’t be stored locally. Devices such as smartphones, laptops and tablets are small, and though their processing power is continuously growing, they still ship with relatively little storage and RAM. Storing data in the cloud solves this problem.
The cloud isn’t going out of date any time soon, but some technologies now being developed cannot rely on it entirely. These machines and devices require fast, secure processing of data. They need to make instant decisions, and failing to make those decisions quickly could have dire consequences.
IoT and the Edge
These IoT devices include healthcare devices, financial systems and autonomous machines, to name just a few, and they require edge computing. This means the majority of computer processing is conducted at the edge of the network, i.e. within the device itself.
The best example we can give you is the self-driving car. An autonomous vehicle has to utilise edge computing and AI learning to make real-time decisions about what it is presented with. If a pedestrian suddenly runs into the road, a self-driving car doesn’t have time to communicate with the cloud to find out what it should do; it has to know immediately or risk causing injury.
The cloud is limited by the speed of light, which certainly sounds quick, but even fractions of a second of delay when communicating with servers around the world aren’t acceptable for every type of technology we have in mind. Just think how infuriating it is when you shoot someone in an online game and the lag doesn’t register the hit. We can’t have that lag in some new IoT devices.
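To put that speed-of-light limit in numbers, here is a rough back-of-the-envelope sketch. The distances (10,000 km to a far-away cloud region, 100 km to a nearby edge node) are illustrative assumptions, not measured routes, and real networks add routing and processing delay on top of this physical floor:

```python
# Lower bound on network round-trip time imposed by the speed of light.
# Distances below are illustrative assumptions, not measured routes.

SPEED_OF_LIGHT_KM_S = 300_000  # ~3 x 10^8 m/s in a vacuum; light in fibre is ~30% slower

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds over a straight-line path."""
    return (2 * distance_km / SPEED_OF_LIGHT_KM_S) * 1000

# A cloud server on the other side of the world (~10,000 km away):
print(f"cloud: {min_round_trip_ms(10_000):.1f} ms")  # cloud: 66.7 ms

# A hypothetical edge node in the same city (~100 km away):
print(f"edge:  {min_round_trip_ms(100):.1f} ms")   # edge:  0.7 ms
```

Even in this best case, a distant cloud costs tens of milliseconds per round trip before any server-side work happens, which is exactly the budget a real-time device cannot afford to spend.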
To enable this technology, we are developing computer processors capable of ever faster calculations; in 2019 Google claimed to have achieved quantum supremacy (a claim IBM disputed), and some predict that in the near future we will build technology with processing power comparable to the human brain. Once we’ve done that, we can go even further.
When it comes to AI, we’re not at the stage where technology is capable of flawless human-like decisions, but the strides we are taking towards it enable a whole host of advanced IoT edge computing tech.
Great for Big Tech
Edge computing is something we’ll see the big tech giants investing huge amounts in. If more computing can be done at the edge, it saves on server space and processing costs, and spares them the frustrated customers who can’t cope with lag.
Edge computing may even answer many IoT security concerns. Rather than relying on the security of a hackable cloud, devices will use edge computing to encrypt and secure data at the source, so that it passes through as few devices as possible.
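As a toy illustration of encrypting at the source, here is a minimal sketch using only Python’s standard library. The one-time-pad XOR scheme and the `heart_rate` reading are illustrative choices for this example; real devices would use a standard authenticated cipher such as AES-GCM via a vetted crypto library:

```python
import secrets

def xor_pad(data: bytes, key: bytes) -> bytes:
    """One-time pad: secure only if the key is random, secret and never reused.
    A teaching sketch, not production cryptography."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

reading = b"heart_rate=72"                  # captured on the device
key = secrets.token_bytes(len(reading))     # key generated on the device, never transmitted
ciphertext = xor_pad(reading, key)

# Only the ciphertext ever leaves the device; decryption is the same XOR.
assert xor_pad(ciphertext, key) == reading
```

The point of the sketch is the flow, not the cipher: the plaintext never crosses the network, which is the edge-computing security argument in miniature.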
Of course, all of this means the technology of edge computing needs to be evergreen, which is why it needs to be intelligent. You can’t have your autonomous vehicle making mistakes just because it needs a major update. There will still be updates, but they’ll be smaller, more dynamic and, most importantly, automatic, keeping your tech on the latest micro-version as soon as it can connect. There won’t be major version jumps on the scale of Windows 95, Vista and Windows 10; the average user won’t notice any major difference between one version and the next, because the changes will happen so incrementally.
Eventually, big tech companies want to create software with no installing and no manual input to approve updates; they want it to stay in sync with your life. Of course, this is where questions of morality and personal data security come into play, but it’s worth noting that this futuristic, world-changing technology is simply not possible unless we trust the big corporations with our data. That’s why we’ll likely see more and more of this discussed at government level, with regulations enshrined in law.
Computing at the Edge also benefits speed through how it handles your bandwidth. If your smart device knows to process data and sort it by importance, dropping anything non-essential, it can stream a smaller, more curated version of the data to the cloud or to your network.
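A minimal sketch of that sort-and-drop idea might look like this. The sensor field names and the importance threshold are hypothetical, invented for the example; a real device would apply whatever rules its application defines:

```python
def filter_for_upload(readings, threshold=30.0):
    """Keep only readings above a (hypothetical) importance threshold,
    so only a small, curated stream is sent on to the cloud."""
    return [r for r in readings if r["value"] >= threshold]

raw = [
    {"sensor": "temp", "value": 21.5},   # normal reading: dropped at the edge
    {"sensor": "temp", "value": 48.2},   # anomaly: worth uploading
    {"sensor": "temp", "value": 19.9},   # normal reading: dropped at the edge
]
to_cloud = filter_for_upload(raw)
print(len(raw), "readings captured,", len(to_cloud), "uploaded")  # 3 readings captured, 1 uploaded
```

Two thirds of the traffic never leaves the device, which is where the bandwidth saving comes from.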
Think about how Google now lets you view some websites offline. It has learned which sites you, and the rest of the web, frequent most, and it saves cached copies of those sites locally, so you can access them quickly or even without an internet connection. This is the type of edge computing that will also let you use your favourite apps without being online.
So what do you think? Are you ready to relinquish control to the IoT? Will technology really change humanity for the better, or will it be for the worse? We’ll certainly find out in this decade.