How does all the stuff in the world get connected, until humans live lives with the equivalent of “an angel on your shoulder,” an artificial intelligence that is pervasive, like your own thoughts?
And who the heck is going to build all that?
Such are the provocative questions that emerged during a Monday afternoon session on artificial intelligence at the Mobile World Congress trade show in Barcelona.
The session drew an absolutely packed auditorium, an already airless room becoming even more so, a sign of just how much interest such questions attract.
Since this is a telecom show, the panel of entrepreneurs and academics nimbly threaded the connections between the emergent 5G networking technology, wearables, and something called “edge computing,” in a session dubbed “A.I. Everywhere.”
The panel’s moderator, Robert Marcus, general partner of Quantum Wave Capital, a Silicon Valley firm on the storied Sand Hill Road, talked of “massive” change that will come from enabling things.
Marcus’s point, as laid out in an initial slide, was that Apple’s (AAPL) first iPhone, in 2007, set off a burst of digital activity, one whose apps went on to take full advantage of 4G networking.
Now, he said, the advent of 5G will make possible edge computing, which will make possible “orders of magnitude” increases in compute, which will in turn make possible A.I. everywhere.
By way of background, it is increasingly clear that 5G is more about connecting many devices, perhaps unmanned ones such as factory robots, than it is about bringing greater speeds to human users of smartphones.
Sure, speeds will rise for users on Verizon Communications (VZ) or other networks. But the most novel enhancement that comes with 5G, something not even discussed in past generations of the technology, is a reduction in “latency,” the time it takes the first bit of a transmission to reach its destination.
Marcus’s apostle for the technical details was his first speaker, Mahadev Satyanarayanan, a professor at Carnegie Mellon University. His passion is the emerging “tier” of edge computing, which sits between cloud computing, which is centralized, and all the billions of devices that will be connected in the world, including smartwatches and self-driving cars and on and on.
Satyanarayanan, who was referred to by Marcus as “Satya,” informed the audience he had been working on edge computing “since as long as there has been edge computing,” which sounded rather confusing given that the term seems to have popped up only in the last two years.
In any event, Satya’s main point was that, for a variety of reasons, there needs to be something outside the central facilities of Amazon (AMZN), or Alphabet’s (GOOGL) Google, or Microsoft’s (MSFT) Azure to interface with all the connected things.
“The ability to process without sending to the cloud is absolutely crucial” to the future of A.I., he said.