The Edge Is Where the Opportunity Is, and Why AI Needs to Evolve
This is a two-part series on the impact of AI at the edge as compared to the core, and why the approach to building AI algorithms needs to change if they are to operate more efficiently at the edge.
From enabling autonomous vehicles to finding the cure for cancer to making that consumer interaction with the cable company productive, artificial intelligence has been touted as the silver bullet for a whole slew of our business problems. It is no accident that AI is seeing such momentum after being stuck in the doldrums for so long. After all, the term AI was coined way back in 1956 when formal research in this area first started.
What Changed and Why so Rapidly?
Three trends are converging to create a perfect storm of opportunity for AI:
- Cloud computing is now ubiquitous and price-competitive to boot
- Devices are proliferating — think connected vehicles, personal phones, bio/fitness trackers, or industrial IoT sensors and data loggers. IDC estimates that the number of connected devices will increase from 23 billion in 2018 to 75 billion in 2025.
- Companies globally are realizing the value of transforming themselves digitally.
So, for the first time, we have a situation where there is lots of data, an inexpensive way to store and analyze that data, and a business imperative to extract value from it all. The only missing piece was how to do it technically, and that is where AI (and AI-led automation) completes the puzzle.
Say hello to HAL 2.0. Not quite the sentient being we were hoping for, but an eminently more useful system nonetheless.
Cloud First to Cloud Only
There is no other way to put it: we are living in a cloud economy right now. Gartner estimates that through 2022, the cloud services industry will grow at three times the rate of overall IT services. By the end of 2019, just a few short months from now, more than 30 percent of technology providers’ new software investments will shift from cloud-first to cloud-only. Moving enterprise workloads to the cloud remains among the top three investment priorities for over a third of the companies surveyed. So much so that even legacy ERP providers like SAP are now cozying up to cloud giants as they continue their evolution in a cloud-centric world.
Core Versus Edge
IDC estimates that by 2025, 49% of the world’s data will be stored in public clouds, like those of Google or AWS. And this is where it gets interesting. Currently, the vast majority of the world’s “datasphere” resides in the core, in central servers managed by the likes of AWS or in enterprise data centers. But the growth in data creation at the edge is far outpacing that in the core. For example, all of the 75 billion devices mentioned earlier will be generating data around the clock, and much of it will need to be processed right there at the edge.
Most of this data will be processed at the edge using models trained in the core; only a fraction of it will be sent to a central server for processing. In fact, there are many industry-specific use cases where it will not make economic or business sense to process that data in the cloud versus doing it right there at the edge, where it was generated in the first place.
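To make the "train in the core, infer at the edge" split concrete, here is a deliberately toy Python sketch (not a description of any vendor's actual stack): training runs centrally on historical data, and only the tiny resulting model parameters are shipped to the device, where each new reading is classified locally with no network round-trip.

```python
def train_in_core(samples):
    """Pretend cloud-side training: learn a simple threshold that
    separates 'normal' from 'abnormal' readings."""
    normal = [x for x, label in samples if label == "normal"]
    abnormal = [x for x, label in samples if label == "abnormal"]
    # Place the decision boundary midway between the two classes.
    return {"threshold": (max(normal) + min(abnormal)) / 2}

def infer_at_edge(model, reading):
    """Edge-side inference: just a comparison against locally stored
    model parameters, so the answer is immediate."""
    return "abnormal" if reading > model["threshold"] else "normal"

# Core: train once on historical data, then push the tiny model down.
history = [(0.2, "normal"), (0.3, "normal"), (0.9, "abnormal"), (1.1, "abnormal")]
model = train_in_core(history)

# Edge: classify new readings on-device, without contacting the cloud.
print(infer_at_edge(model, 0.25))  # -> normal
print(infer_at_edge(model, 0.95))  # -> abnormal
```

In practice the "model" would be a compressed neural network rather than a single threshold, but the division of labor is the same: heavy training where compute is cheap, lightweight inference where the data is born.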
And this is exactly the problem and why the current approach to AI needs to evolve.
While the cloud offers us economies of scale, computing elasticity, and significant process automation, there is still that pesky issue of physics that doesn’t quite go away. Physical constraints such as power consumption, memory capacity, and network connectivity limit what we can and cannot do with this amazing cloud, and there are non-trivial business constraints as well. If you want to display ads for personalized merchandise to a consumer walking inside a retail store, you only have a few seconds to do so. You don’t have the luxury of sending the data to a central repository, analyzing it, and then displaying it back to the customer; by then the customer has walked away, and the sale opportunity is lost. Similarly, if an IoT device in a factory environment has picked up strange noises coming from the machinery, you need to take immediate remedial action or risk incurring significant damage to that machinery.
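The factory scenario above can be sketched in a few lines of Python. This is a minimal, illustrative on-device anomaly detector (a rolling z-score over recent sensor readings, with made-up window and threshold values, not tuned ones): the device flags a suspicious reading and can act the instant it arrives, rather than after a cloud round-trip.

```python
from collections import deque
from statistics import mean, stdev

class EdgeAnomalyDetector:
    """Flag readings that deviate sharply from recent local history."""

    def __init__(self, window=20, z_threshold=3.0):
        self.window = deque(maxlen=window)  # recent readings only
        self.z_threshold = z_threshold

    def observe(self, reading):
        """Return True if the reading looks anomalous; update history."""
        anomalous = False
        if len(self.window) >= 5:  # need a few samples before judging
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(reading - mu) / sigma > self.z_threshold:
                anomalous = True
        self.window.append(reading)
        return anomalous

detector = EdgeAnomalyDetector()
# Steady machinery hum...
for r in [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1]:
    detector.observe(r)
# ...then a sudden spike: act locally, right now.
if detector.observe(9.0):
    print("ALERT: take remedial action on machinery")
```

The point is not the statistics but the placement: because both the history and the decision live on the device, the alert fires in microseconds regardless of network conditions.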
It is clearly evident that edge devices, as constrained as they are for memory, power, and processing capacity, need to be made smarter so they can do the things that previously needed huge centralized servers. In order to do that, AI workflows need to be simplified so that AI can be embedded into these devices without needing to have legions of data scientists working on multiple iterations of the same model before it can be successfully deployed.
And that is where Latent AI comes in.
Introducing Latent AI
Latent AI is an early-stage venture spinout of SRI International whose mission is to enable developers and change the way we think about building AI for the edge. Our first product is designed to help companies add AI to edge devices and to empower users with new smart IoT applications.
Some of the challenges that Latent AI has successfully addressed with its unique technology include:
- How to enable resource-constrained companies to train and deploy AI models for the edge, and do so efficiently and cost-effectively?
- How to democratize AI development so that developers can build new edge computing applications without worrying about resource constraints on their target platforms (e.g. size, weight, power, and cost)?
- How to manage AI workloads dynamically, tuning performance and reducing compute requirements as conditions demand?
In the next edition of this 2-part blog series, we will dig into some of the details of how Latent AI democratizes AI development for the edge, and share some examples of how Latent AI technology will radically transform the AI landscape as we know it.
As Arthur C. Clarke once said, “I don’t pretend we have all the answers. But the questions are certainly worth thinking about.”
Join the conversation and/or follow me on Twitter @BansalManju.
Manju Bansal, Advisor, Latent AI, Inc.