See Edge MLOps in Action — Request a Demo
Our flagship product, the Latent AI Efficient Inference Platform™ (LEIP), is a machine learning operations software development kit (MLOps SDK) that optimizes and secures neural network runtimes. LEIP is a unified SDK built specifically for machine learning engineers and AI software developers. It brings AI to the edge by optimizing for compute, energy, and memory without requiring changes to existing AI/ML infrastructure and frameworks. LEIP ships with customizable templates called Recipes that are pre-qualified for your hardware, with all the maintenance and configuration reduced to a single command line call. Recipes answer questions like: “How do I train a YOLO model on the MS COCO dataset and run it on a Raspberry Pi?” Because the target platform is already handled for you, iteration and model delivery are much faster.
LEIP compresses conventional AI models to 8-bit integer precision and below without losing accuracy, delivering up to a 10x reduction in model size and a 3x improvement in inference speed. And with Latent AI’s Adaptive AI technology, a model can self-regulate its computational needs, firing only the parts of the neural network necessary to get the job done.
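To give a sense of where those size savings come from, here is a minimal, self-contained sketch of symmetric 8-bit post-training quantization, the general technique behind integer-precision model compression. This is a hypothetical illustration for intuition only, not the LEIP API; the function names are our own.

```python
def quantize_int8(weights):
    """Map float weights to int8 values plus one float scale factor.

    Each int8 value occupies 1 byte instead of 4 for float32,
    giving roughly a 4x reduction in storage for the weights alone.
    """
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

weights = [0.52, -1.27, 0.03, 0.88, -0.41]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Rounding error is bounded by one quantization step (the scale).
assert all(abs(w - r) <= scale for w, r in zip(weights, restored))
```

Real toolchains go further than this sketch, using per-channel scales, calibration data, and sub-8-bit formats, but the core idea is the same: trade continuous precision for compact integer storage while keeping the reconstruction error small.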
With LEIP, ML engineers develop optimized neural network runtimes for heterogeneous low-power hardware (CPU/GPU/DSP). Our Latent AI Runtime Engine (LRE) offers a modular micro-service software stack for secure inference processing, with support for CI/CD and asset tracking across the entire model lifecycle.
Latent AI takes the hard work out of design and deployment by exploring thousands of candidate recipes pre-configured for your hardware. With hundreds of target models and hardware platforms already supported, LEIP Recipes makes it easy to find the best-optimized configuration that meets your design requirements.