Cloud Computing soon to take back seat to Edge Computing
The majority of processing will take place at the device level.
As devices like drones, autonomous cars and robots proliferate, they are going to require extremely rapid processing — so fast, in fact, that sending data up to the cloud and back to get an answer will simply be too slow.
When you consider that it’s taken the better part of a decade for most companies to warm to the idea of going to the cloud, it may sound crazy to some that we are already about to supplant it and move onto the next paradigm.
That’s not to say that the cloud won’t continue to have a key place in the computing ecosystem. It will. But its role is about to change fairly dramatically: it will process data for machine learning purposes, acting as an adjunct to the more immediate processing happening at the edge.
Many companies are beginning to recognize this, and we could be about to witness a massive computing shift just as we’ve begun to get used to the previous one.
I feel like we’ve been here before
If the idea of processing data at the edge sounds familiar, it should. Computing has gone in massive cycles, shifting from centralized to distributed and back again, and the coming move to the edge is just another manifestation of that.
It only makes sense that the next trend will swing back to a distributed system driven by the sheer volume of Internet of Things devices. When the number of devices on the planet is no longer limited by the number of humans, it has the potential to raise the number of computers in the world by an order of magnitude, and that will force a change in the way we think about computing in the future.
We are at the very beginning of this change. As autonomous cars and drones start to appear, they will eventually give way to a proliferation of smart devices of all kinds, and it’s going to happen quickly.
Processing massive amounts of data
Peter Levine, general partner at venture capital firm Andreessen Horowitz, puts it this way: "Think about a self-driving car, it’s effectively a data center on wheels, and a drone is a data center with wings and a robot is a data center with arms and legs and a [ship] is a floating data center. These devices are processing vast amounts of information and that information needs to be processed in real time." That means even the split-second latency required to pass information between these systems and the cloud simply takes too long.
"Think about a self-driving car, it’s effectively a data center on wheels." - Peter Levine, general partner at VC firm Andreessen Horowitz
I feel the need for speed
Deepu Talla, VP and GM at Nvidia, whose GPU chips are helping fuel AI and robotics, says there are a number of reasons companies move to the edge, but it starts with a need for speed and pure practicality.
Talla says it’s not just big machines that Merfeld and Levine are talking about. For some Internet of Things devices, like connected video cameras, it also ceases to be practical to send the data to the cloud just because of the pure volume involved.
As an example, he points out that there are already half a billion connected cameras in place today, with a billion expected to be deployed worldwide by 2020. Once you get beyond 1080p quality, he says, it ceases to make sense to send the video to the cloud for processing, at least initially, especially if you are using the cameras in a security-sensitive zone like an airport, where you need to make decisions fast if there is an issue.
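A rough calculation shows why shipping all that video upstream stops making sense at that scale. The per-camera bitrate below is an assumed typical figure for a 1080p H.264 stream, not a number from the article:

```python
# Back-of-the-envelope: aggregate upstream traffic if every connected
# camera streamed 1080p video to the cloud. The 5 Mbps per-camera
# bitrate is an assumed typical figure, purely for illustration.

CAMERAS = 1_000_000_000   # projected worldwide camera count by 2020
BITRATE_MBPS = 5          # assumed bitrate of one 1080p stream

# Convert total megabits per second to terabits per second.
total_tbps = CAMERAS * BITRATE_MBPS / 1_000_000

print(f"Aggregate upstream traffic: {total_tbps:,.0f} Tbps")
```

Even with these conservative assumptions, the total runs to thousands of terabits per second, which is why analyzing frames on or near the camera and sending only results or flagged clips to the cloud is the more practical design.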
Then there’s latency. Talla echoes Levine’s thinking here, saying machines like self-driving cars and industrial robots need decisions in fractions of seconds, and there just isn’t time to send the data to the cloud and back.
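To put rough numbers on the latency argument, here is a quick sketch. The figures (a 100 ms cloud round trip, a 10 ms on-device inference, highway speed) are illustrative assumptions, not measurements from any particular system:

```python
# Back-of-the-envelope: how far does a car travel while waiting for
# a decision? All numbers below are illustrative assumptions.

CAR_SPEED_MPS = 31.0        # roughly 70 mph, in meters per second
CLOUD_ROUND_TRIP_S = 0.100  # assumed 100 ms network round trip
EDGE_INFERENCE_S = 0.010    # assumed 10 ms on-device inference

def distance_traveled(speed_mps: float, delay_s: float) -> float:
    """Meters covered during a processing delay."""
    return speed_mps * delay_s

cloud_m = distance_traveled(CAR_SPEED_MPS, CLOUD_ROUND_TRIP_S)
edge_m = distance_traveled(CAR_SPEED_MPS, EDGE_INFERENCE_S)

print(f"Cloud round trip: car moves {cloud_m:.2f} m before deciding")
print(f"On-device:        car moves {edge_m:.2f} m before deciding")
```

Under these assumptions the car travels about three meters, the better part of a car length, before a cloud-based answer could even arrive, which is the gap edge processing is meant to close.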
He adds that there are sometimes privacy issues, where data could be considered too sensitive to send to the cloud and might have to remain on the device. Finally, companies may keep data at the edge because of a lack of bandwidth: if you are dealing with a location where you can’t stream data, you have to process it at the edge. There is no choice.
AWS and Microsoft have noticed
AWS and Microsoft are always looking for what’s coming next, so it shouldn’t come as a surprise that the biggest public cloud providers already have products aimed at the edge market. For AWS, it’s a product called Greengrass, which provides a set of compute services directly on IoT devices when public cloud resources aren’t available for whatever reason.
For Microsoft, it’s Azure Stack, which offers a set of public cloud services inside a customer’s own data center, providing public cloud-like resources without having to move data back and forth to the public cloud.
It’s only a matter of time before other vendors, and whole new companies, begin to offer their own take on edge computing.
What does it all mean?
If this change happens, it’s going to have a profound impact on computing as we know it. It will require new ways of programming, securing and storing data, and it will change how we think about machine learning. "Every area of the compute stack gets upended as we see distributed computing come back," Levine said. That represents a tremendous opportunity for both startups and VCs, especially those that get in early.
And just as we saw companies ahead of the cloud and mobile curve a decade ago, Levine says he is starting to see companies planting seeds in this area.
As we’ve seen, no form of computing ever quite goes away when a new one comes along. IBM is still selling mainframes. There are client/server networks inside many organizations across the world today, and mobile/cloud will still exist. But edge computing could change how we think about computing, how we build computers and how we write programs.
The time to start thinking about this is right now, before the change takes hold. After we are in the middle of it, the best ideas will already have been taken and it will be too late.