The future of AI is on the edge
Artificial Intelligence (AI) is coming to every device near you, according to Samsung NEXT Ventures managing director Brendon Kim. During a conversation with MIT Technology Review’s Elizabeth Bramson-Boudreau at the EmTech Digital conference in San Francisco, Kim said the next big thing would be AI operating at the edge.
The “edge” in this case refers to all the devices and processors that we interact with every day of our lives, as opposed to the remote servers that constitute the cloud. Data processing at the edge will mean not having to connect to a cloud server, which in turn will enable data processing and analysis to happen in real time.
The cloud has revolutionized everything from office productivity to entertainment-on-demand. Yet, when it comes to artificial intelligence, it has some serious limitations, especially for low-latency applications. Self-driving cars, for example, can’t afford to wait on remote servers for instructions that need split-second timing to avoid accidents.
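To put the latency point in rough numbers (the figures below are illustrative assumptions, not from the article): a car at highway speed covers several meters during a typical cloud round trip, while an on-device decision can be made in milliseconds.

```python
# Back-of-the-envelope latency comparison. All numbers are assumed
# for illustration: highway speed ~30 m/s, a 100 ms cloud round trip,
# and 5 ms for local, on-device inference.
speed_mps = 30.0            # ~108 km/h
cloud_round_trip_s = 0.100  # assumed network round trip to a remote server
edge_inference_s = 0.005    # assumed on-device inference time

cloud_travel_m = speed_mps * cloud_round_trip_s  # distance covered waiting on the cloud
edge_travel_m = speed_mps * edge_inference_s     # distance covered deciding locally

print(f"Waiting on the cloud: the car travels {cloud_travel_m:.2f} m")
print(f"Deciding at the edge: the car travels {edge_travel_m:.2f} m")
```

Even under generous assumptions, the gap is car lengths versus centimeters, which is why split-second applications can't wait on remote servers.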
That could change in the near future as powerful new AI technologies are designed to operate within all the phones, laptops, and appliances around us — all without having to connect to processors in the cloud.
“AI is going to be embedded in everything that we do, everything that we touch, and everything that we use,” Kim said. Making that possible requires major new investments in everything from processors to networking technology as well as software—investments that Samsung is now making.
“Outside of AI, a lot of what needs to be done is in edge computing investments,” according to Kim.
Samsung’s $22 billion bet

Samsung plans to spend $22 billion on advanced tech by 2022, with AI and 5G heading the list. As part of its research initiative, Samsung plans to add 1,000 scientists at AI-dedicated research centers around the world.
Kim provided some insight into what Samsung’s researchers are currently working on. Their work involves a paradigm shift that will help AI move from the cloud to the edge.
“We still care about AI at the center. We care about AI in the cloud. But we’re much more interested in AI at the edge. At the edge, you have much less compute power and much less data. But because of use cases like autonomous driving, you still need to have good decision-making at the edge. So what can you do with less data and less compute power at the edge?” Samsung researchers and Samsung NEXT investors are looking for the answer.
Most AI applications depend on deep learning algorithms trained on data sets of tens of thousands to millions of examples. Deep learning has a place in AI’s evolution, but not every problem can be solved by it, and it demands computing power that may not readily fit onto affordable consumer-grade devices.
Moreover, deep learning software still has a long way to go before it can come close to matching the human brain’s ability to recognize handwriting or objects in unfamiliar environments, according to Kim. Because of those inherent limitations, and because conventional approaches require computing power that edge devices lack, Samsung wants to go beyond the usual approach to deep learning.
One possible direction: AI that learns not from millions of samples in a database but from observing human trainers thanks to advances in the fields of deep reinforcement learning and imitation learning.
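The simplest form of imitation learning, behavior cloning, treats a trainer’s demonstrations as supervised data: the model learns to map observed states to the actions the human took. A minimal sketch, with entirely made-up toy data and no connection to Samsung’s or Covariant’s actual systems:

```python
# Toy behavior cloning: fit a one-parameter policy to expert (state, action)
# pairs. The "expert" here is a hypothetical trainer whose action is 2x.
demos = [(s / 10.0, 2.0 * (s / 10.0)) for s in range(-50, 51)]

# Supervised fit (one-dimensional least squares): choose w minimizing
# the squared error sum((w * state - action)^2) over the demonstrations.
num = sum(s * a for s, a in demos)
den = sum(s * s for s, _ in demos)
w = num / den  # recovers the expert's slope, ~2.0

def policy(state):
    """Cloned policy: reproduces the demonstrated behavior on new states."""
    return w * state
```

The appeal for the edge is that learning from a handful of demonstrations sidesteps the million-example data sets that conventional deep learning requires.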
As part of its investment in this approach, Samsung has funded a startup called Covariant. Engineers at the UC Berkeley spinoff are teaching industrial robots conceptual tasks that can be applied in a wide range of situations. Such learning improves on the current practice of programming rote actions that AI-driven robots can only perform the same way every time.
In the area of hardware, Kim said Samsung research teams are working on processors optimized for AI. “We sort of fell into graphics processing units, known as GPUs, because that was what we had,” Kim said. “That’s helped us tremendously in AI, but it wasn’t purpose-built for AI.”
New chip architectures, Kim said, will help accelerate AI’s expansion beyond the cloud: processors purpose-built for AI can do more with less compute power and memory than their general-purpose counterparts, making them better suited to life on the edge.
Kim also expects such edge-adapted, AI-optimized processors to get a boost from orchestration: the coordination of separate processors in a given vicinity.
An orchestra of devices
Imagine a future in which every device in your vicinity draws on the resources of every other device around you to form a system stronger than the sum of its parts.
Such orchestration would allow, for example, a TV to draw on the computing power of a phone in the same room to run a game. That’s the vision Kim articulated for the future of edge computing, which he said Samsung is actively pursuing.
“One of the exciting things for us at Samsung is that we have such a very large IP footprint at the edge,” said Kim. “We particularly like it because it plays to our strength. We don’t have this big cloud infrastructure, but we have more devices at the edge than just about anybody.”
Orchestration could even extend beyond the devices owned by a given person. “There’s lots of excess compute power at the edge,” said Kim. “All of us in this room have a mini-supercomputer in our pockets or in our hands, but not 100% of that is being utilized.” What if, Kim asked, you could earn incentives for sharing that power with other consumers or even companies?
The resulting mini clouds—formed of edge devices owned by multiple people or even companies—could combine the low-latency benefits of computing on the edge with some of the brute computing power of the cloud, bringing us the best of both worlds.
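One way to picture such orchestration is as a scheduler that places each task on whichever nearby device currently has the most spare capacity, falling back to the cloud when nothing local fits. The sketch below is purely illustrative; the device names, capacities, and greedy placement rule are assumptions, not Samsung’s design.

```python
# Illustrative greedy orchestrator: assign each task to the nearby device
# with the most free compute. Devices, capacities, and costs are made up.
devices = {"phone": 8.0, "tv": 4.0, "tablet": 6.0}  # free compute units

def place(task_cost, free):
    """Pick the device with the most headroom that can still fit the task."""
    candidates = [d for d, cap in free.items() if cap >= task_cost]
    if not candidates:
        return None  # nothing local fits: fall back to the cloud
    best = max(candidates, key=lambda d: free[d])
    free[best] -= task_cost  # reserve the capacity
    return best

tasks = [("game-render", 5.0), ("voice-assist", 2.0), ("photo-tag", 6.0)]
assignments = {name: place(cost, devices) for name, cost in tasks}
```

In this toy run the first two tasks land on the phone and tablet, while the last one overflows to the cloud, which is the hybrid “best of both worlds” the mini-cloud idea describes.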
It’s all in the service of taking AI everywhere and bringing it to everything. Samsung, with its large base of devices, including phones, TVs, tablets, wearables, and more, along with a strong commitment to investing in computing at the edge, seems well positioned to succeed.
“We sell over 500 million devices each year,” said Kim. “We’re committed to making every one of those devices intelligent in some way.”