Latent AI's Technologies Benefit Your Edge AI Development
Introducing
Latent AI Efficient Inference Platform (LEIP)
LEIP is a modular, fully integrated workflow designed to train, quantize, and deploy edge AI neural networks. LEIP Compress and LEIP Compile are available now, with more module capabilities on the way.
Who Should Use LEIP?
LEIP is designed for AI, embedded and software application developers to easily enable, deploy and manage AI for the edge.
Module Benefits
LEIP Compress
Tooling designed for edge AI use cases. LEIP Compress is a state-of-the-art quantization optimizer that supports both post-training and training-aware quantization (a general sketch of what post-training quantization does follows the list below).
- Compresses neural networks to balance performance and resource usage against the specifications of the target hardware.
- Saves time by removing the unnecessary iterations required by traditional tuning and pruning methods.
- Saves cost by reducing the specialized personnel dedicated to quantizing and optimizing AI models.
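LEIP Compress's internal methods are not described on this page; as a rough illustration of the underlying idea, the minimal sketch below shows uniform affine INT8 post-training quantization of a single weight tensor in plain NumPy. The function names and tensor sizes are hypothetical, and this is not Latent AI's API.

```python
# Minimal illustration of uniform affine INT8 post-training quantization of a
# single weight tensor. Generic sketch only; not Latent AI's implementation.
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map FP32 weights onto the INT8 range [-128, 127] with a scale and zero point."""
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 255.0 if w_max > w_min else 1.0
    zero_point = int(round(-128 - w_min / scale))
    q = np.clip(np.round(weights / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize_int8(q: np.ndarray, scale: float, zero_point: int) -> np.ndarray:
    """Recover approximate FP32 values; the gap to the originals is the quantization error."""
    return (q.astype(np.float32) - zero_point) * scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(0.0, 0.2, size=(64, 64)).astype(np.float32)  # stand-in layer weights
    q, scale, zp = quantize_int8(w)
    w_hat = dequantize_int8(q, scale, zp)
    print("mean abs quantization error:", float(np.abs(w - w_hat).mean()))
    print("storage: FP32", w.nbytes, "bytes -> INT8", q.nbytes, "bytes (4x smaller)")
```

Shrinking FP32 weights to INT8 in this way is the source of the memory and compute savings that a quantization optimizer then balances against the target hardware's capabilities.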
Are you ready to take the LEIP?
Test drive LEIP Compress
LEIP Compile
LEIP Compile optimizes neural network processing for target hardware processors (a sketch of a generic framework-to-binary compilation flow follows the list below).
- The first product to integrate a Deep Neural Network training and compiler framework
- A seamless, end-to-end workflow from ML training framework to an executable binary running on target edge AI hardware
- Full optimization that compresses library files by 10x and runs at 5x lower latency, with Deep Neural Networks compiled in a matter of hours
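The LEIP Compile toolchain itself is not shown on this page; as a hedged illustration of what a framework-to-binary flow for an edge target can look like, the sketch below uses the open-source Apache TVM compiler stack. The model file, input name and shape, target triple, and cross-compiler path are placeholder assumptions, and this is not Latent AI's API.

```python
# Generic sketch: compile a trained model into a deployable shared library for
# an ARM edge board using Apache TVM (Relay). Not LEIP's toolchain or API.
import onnx
import tvm
from tvm import relay
from tvm.contrib import cc

onnx_model = onnx.load("model.onnx")                 # model exported from the training framework
shape_dict = {"input": (1, 3, 224, 224)}             # assumed input tensor name and shape
mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)

# Target a 64-bit ARM edge device; swap the triple for your hardware.
target = "llvm -mtriple=aarch64-linux-gnu"

with tvm.transform.PassContext(opt_level=3):         # enable graph-level optimizations
    lib = relay.build(mod, target=target, params=params)

# Cross-compile the library on the host; the compiler path is an assumption.
lib.export_library("model_arm64.so",
                   fcompile=cc.cross_compiler("aarch64-linux-gnu-gcc"))
```

The resulting shared library is the kind of executable artifact that gets copied onto the edge device and loaded by the runtime there.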
Learn more about LEIP Compile
Get Access
Latent AI Example Use Cases
Leadership Team
Executive Team
Board of Directors and Advisors
We’re hiring! Visit our Careers page to learn more or email us at careers@latentai.com.
“SRI International has long been at the forefront of research in the rapidly evolving fields of machine learning, computer vision, and robotics. We are proud to share SRI’s deep expertise and cutting-edge research in these areas with Latent AI to accelerate a solution that will bring the visionary promise of computing and AI to real-world applications.”
Manish Kothari, Ph.D., SRI International President
“The edge needs AI, and AI needs the edge. Latent AI integrates both with a portfolio of IoT edge compute optimizers and accelerators that bring an order of magnitude improvement to existing infrastructure. This is essential as the majority of new software today is AI and most compute cycles will shift to the edge.”
Steve Jurvetson, Founder and Managing Partner, Future Ventures
“The rapid evolution of artificial intelligence has led to a redefining of performance requirements at the edge. Jags Kandasamy and his team at Latent AI have demonstrated significant expertise in their ability to optimize edge performance without changing existing AI infrastructure. We look forward to working with Latent AI as the team continues to execute on its vision.”
Anurag Jain, Managing Partner, Perot Jain
“I provided funding to this group when they were at SRI and I was a DARPA Program Manager. I was impressed with their approach to Deep Learning. I am excited to see that they will be commercializing the technology. It is a high-quality team that has been doing some of the best work in both low-precision and temporal-based Deep Learning.”
Dr. Dan Hammerstrom, Professor Emeritus, Portland State University, and former DARPA Program Manager