AI and the New Data Refinery for the Edge Continuum

by Jags Kandasamy | Posted Oct 08, 2020

In 2017, The Economist published an article, "The Fuel of the Future: Data Giving Rise to a New Economy." Today, that observation is truer than ever, and prospecting for new data, in the form of new applications and ever more sensor-driven data acquisition, continues at a rapid pace.

According to Cisco, in 2020 we generate about 22 exabytes of data globally every four hours (on the order of 130 exabytes a day). That pace is not slowing down anytime soon, as the number of devices coming online continues to grow exponentially.

Imagine this for a minute: according to Juniper Research, the number of voice assistants used to access smart home devices will grow from 105 million in 2019 to 555 million by 2024, more than fivefold growth in less than five years. As the number of units grows, so does the number of voice interactions, and voice is just one sensory input mode.

Now consider the amount of data that visual sensors collect. The current approach is to pipe it all up to massive data centers or the cloud. Is it feasible, or scalable, to have your favorite cloud vendor send a container truck with petabytes of storage just to haul your data off somewhere?

Tying this back to the oil industry: refineries are typically located close to the oil rig, where crude oil is converted into multiple end products at successive stages of the distillation tower. We should likewise consider extracting value from data at different layers, and the role AI will play at each. Jason Shepherd, VP Ecosystems at Zededa, Inc., offers a great viewpoint on Pushing AI to the Edge and the factors we need to consider.

The only way to extract value from the vast amounts of data produced by different sensors is to apply AI at these different layers. Consider an example of this multi-stage value extraction. In the oil and gas industry, pipelines snake through an immense landscape, and each pipe is typically fitted with analog gauges to measure things like pressure and flow. By deploying an edge AI-enabled surveillance drone to capture images of these gauges, it is possible to isolate just the gauge region of each image and send only that critical information to the next compute layer, where the localized needle data is processed into an exact reading. The result is actionable intelligence sent on to the historian or the next process.
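To make the two-stage split concrete, here is a minimal sketch in Python using OpenCV. It stands in for the drone pipeline with classical computer vision (Hough transforms) rather than any particular trained model, and the gauge calibration constants (min_angle, max_angle, min_val, max_val) are hypothetical placeholders that would be set per gauge model:

```python
# Sketch of the two-stage gauge-reading pipeline described above.
# Stage 1 (on the drone): locate the gauge face and crop it, so only a
# small region of interest is transmitted upstream.
# Stage 2 (next compute layer): estimate the needle angle and map it
# to a reading on the gauge scale.
import cv2
import numpy as np

def crop_gauge(frame):
    """Stage 1: find the circular gauge face and return the cropped ROI."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=100, param1=100, param2=60,
                               minRadius=40, maxRadius=0)
    if circles is None:
        return None  # no gauge found; send nothing upstream
    x, y, r = np.round(circles[0, 0]).astype(int)
    return frame[max(y - r, 0):y + r, max(x - r, 0):x + r]

def read_needle(roi, min_angle=45.0, max_angle=315.0,
                min_val=0.0, max_val=100.0):
    """Stage 2: estimate the needle angle from the strongest line
    segment and linearly map it to the scale (hypothetical calibration)."""
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=roi.shape[0] // 4, maxLineGap=5)
    if lines is None:
        return None
    x1, y1, x2, y2 = lines[0][0]  # strongest candidate segment
    # Angle in image coordinates (y axis points down), normalized to [0, 360).
    angle = np.degrees(np.arctan2(y2 - y1, x2 - x1)) % 360
    frac = (angle - min_angle) / (max_angle - min_angle)
    return min_val + frac * (max_val - min_val)
```

In a real deployment, crop_gauge would run at the sensor so only the small cropped image crosses the network, and read_needle would run at the next compute layer; the same split applies if either stage is replaced with a trained neural model.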

The above example is a simple illustration of how different edge compute layers can significantly lower data throughput while still yielding highly valuable insights.

This data refinery will need to be composed of many different innovative solutions, but we believe the future of data includes intelligently and efficiently applying AI across the edge continuum.

P.S. Don’t become a data hoarder. Feel free to reach out to us at info@latentai.com for more help with your data hoarding issues!
