What We Do

We take the hard work out of AI processing at the edge. Latent AI’s LEIP platform brings AI to the edge by optimizing for compute, energy, and memory without requiring changes to existing AI/ML infrastructure and frameworks.

Who We Are

We’re an early-stage venture spinout of SRI International, well funded by industry-leading investors, with support from Fortune 500 clients. Our seasoned team has many years of experience in machine learning, AI, computer vision, embedded systems, IoT applications, and high-performance computing.

How Latent AI got its start: Our Founder’s Story

What We Create

Latent AI Efficient Inference Platform (LEIP) is a modular, fully integrated workflow designed to train, quantize, and deploy edge AI neural networks. Learn more about our available tools, LEIP Compress and LEIP Compile, below.

Latent AI's Technologies Benefit Your Edge AI Development

Adaptive

Dynamically throttles accuracy for compute efficiency

Hardware Agnostic

Supports any processor platform for edge and server

Dynamic AI Workload

No math-intensive ops; processing resources are configured at runtime

Lower Memory Use

Ultra-compact footprint for deep learning

Read our Blog Post: The Next Wave in AI and Machine Learning: Adaptive AI at the Edge

Introducing

Latent AI Efficient Inference Platform (LEIP)

LEIP is a modular, fully integrated workflow designed to train, quantize, and deploy edge AI neural networks. LEIP Compress and LEIP Compile are available now, with more module capabilities on the way.

Who Should Use LEIP?

LEIP is designed for AI, embedded, and software application developers to easily enable, deploy, and manage AI for the edge.

Module Benefits

LEIP Compress

Tooling designed for edge AI use cases: LEIP Compress is a state-of-the-art quantization optimizer that supports both post-training and training-aware quantization (a generic quantization sketch follows the list below).

  • Compresses neural networks to balance performance optimization and resource usage based on the target hardware specification.
  • Saves time by removing the unnecessary iterations required by traditional tuning and pruning methods.
  • Saves cost by reducing the specialized personnel dedicated to quantizing and optimizing AI models.
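
As a rough illustration of the post-training path, here is a minimal quantization sketch using the open-source TensorFlow Lite converter. It shows the general technique only, not the LEIP Compress API; the saved-model path and calibration data are hypothetical placeholders.

```python
# Minimal post-training quantization sketch (TensorFlow Lite).
# Illustration of the general technique only -- not the LEIP Compress API.
# The saved-model path and calibration data are hypothetical placeholders.
import tensorflow as tf

# Stand-in calibration data; in practice, use a small slice of the real training set.
calibration_images = tf.random.uniform((100, 224, 224, 3))

def representative_data_gen():
    # Yield a few samples so the converter can estimate activation ranges.
    for image in calibration_images:
        yield [tf.expand_dims(image, axis=0)]

converter = tf.lite.TFLiteConverter.from_saved_model("path/to/saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
# Restrict to integer-only kernels so the model can run on int8 edge accelerators.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("model_int8.tflite", "wb") as f:
    f.write(converter.convert())
```

LEIP Compress automates this kind of calibration and conversion, tuning the result against the declared target hardware specification.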

 

Are you ready to take the LEIP?

Test drive LEIP Compress

LEIP Compile

LEIP Compile optimizes neural network processing for target hardware processors (a generic compilation sketch follows the list below).

  • The first product to integrate a deep neural network training and compiler framework
  • A seamless, end-to-end workflow from the ML training framework to an executable binary running on target edge AI hardware
  • Full optimization that compresses library files by 10x, compiles deep neural networks in a matter of hours, and delivers 5x lower latency
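
To make the compile step concrete, here is a rough sketch of compiling a trained network into a target-specific binary using the open-source Apache TVM compiler. It illustrates the general idea, not the LEIP Compile workflow; the ONNX file, input name and shape, target triple, and cross-compiler are assumptions for the example.

```python
# Rough sketch of compiling a trained network for an edge target (Apache TVM).
# Illustration only -- not the LEIP Compile workflow. The ONNX file, input
# name/shape, target triple, and cross-compiler are hypothetical assumptions.
import onnx
import tvm
from tvm import relay

# Load a model exported from the ML training framework.
onnx_model = onnx.load("model.onnx")
mod, params = relay.frontend.from_onnx(onnx_model, shape={"input": (1, 3, 224, 224)})

# Cross-compile for a 64-bit ARM edge board.
target = "llvm -mtriple=aarch64-linux-gnu"
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)

# Export a single shared library that the on-device runtime loads.
lib.export_library("model_aarch64.so", cc="aarch64-linux-gnu-g++")
```

LEIP Compile covers this same span inside the LEIP workflow, taking the output of the ML training framework to an executable binary running on the target edge AI hardware.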

 

Learn more about LEIP Compile

Get Access

Latent AI Example Use Cases

Time Critical Inference

  • Collision Detection for Robots (Healthcare)
  • Active Noise Cancelling (Consumer Electronics)
  • Pressure and Vibration Analysis (Industrial Manufacturing)
  • Collision Detection for Robots (Hospitality)

Limited to No Network Access

  • Remote Oil & Gas Platform Control (Industrial Manufacturing)
  • Intruder Alert System for Home Surveillance (Consumer Electronics)
  • Baby Monitoring System (Consumer Electronics)

Security & Privacy

  • Sentiment Analysis of Shoppers at Retail Shelves (Retail)
  • Viewer Analysis of TV Shows (Consumer Electronics)
  • Factory Production Quality Assurance (Industrial Manufacturing)

Resource Constrained Environments

  • Augmented Reality Wearables (Multipurpose)
  • Drone Surveillance (Multipurpose)
  • Wearable Health Trackers (Consumer Electronics)

Efficiency/Cost Savings

  • Efficient Data Collection through On-Device Analysis (Industrial Manufacturing)
  • Low- to High-Resolution Video on Device with Reduced Silicon Requirements (Consumer Electronics)

Leadership Team

Executive Team

Jags Kandasamy
Co-Founder and CEO
Sek Chai
Co-Founder and CTO
Chloe Chan
CFO and COO
Mark Griffin
Director, System Architecture

Board Directors and Advisors

Raghu Madabushi
Board Director; Senior Director, SRI Ventures and GE Ventures
Bruce Graham
Board Director and Senior Advisor
Alan Boehme
Advisor
Global CTO, P&G
Steve Dyer
Advisor
Former HPE CTO, Enterprise Security Products
Manju Bansal
Advisor
Former VP Marketing at SAP

We’re hiring! Visit our Careers page to learn more or email us at careers@latentai.com.

Latest Latent AI News

Scaling Edge AI in a Data-Driven World, Part One

AI’s Shocking Carbon Footprint

The Era of Commercializing AI, Part Two

The Era of Commercializing AI, Part One

Latent AI, Adaptive AI Optimized for Compute, Energy and Memory

AI Moves to the Edge

It’s Time for Adaptive AI to Enable a Smarter Edge

Latent AI, Inc. Announces Seed Funding Led by Future Ventures

The Next Wave in AI and Machine Learning: Adaptive AI at the Edge

Get in Touch!