We previously explained Latent AI technology and how it delivers optimized edge models quickly and reliably by comparing it to the Iron Chef competitive cooking show. We've talked about how Iron Chef recipes and Latent AI Recipes can be modified to meet regional tastes or specific hardware, respectively. We also touched on what it means to deliver optimized dishes and models at scale. So far, we've talked about the Iron Chef, but not his support staff. The truth is that the Iron Chef is not alone. In an Iron Chef's restaurant, there are sous chefs, kitchen managers, and a host of others who work together to create a meal and an experience. Each day, the staff meets to set the menu and handle other operational tasks so that the kitchen can run at maximum efficiency.
Similarly, the data scientist is not alone. In enterprises large and small, data scientists, ML engineers, and application developers work together to produce data-driven solutions. These solutions are powered by AI algorithms that are modeled, trained, and deployed in an MLOps (ML Operations) pipeline. There are also business and logistics staff who handle testing, deployment, and product requirements. The data scientist is concerned with model accuracy, the ML engineer focuses on the performance (speed and size) of the model runtime, and the application developer focuses on integrating the model runtime with sensor drivers and the overall software stack. Each day, the AI staff meets to set scrum goals in an agile development flow.
Just as the Iron Chef's staff uses recipes to create the best dishes, an AI factory uses Latent AI Recipes to detail how its "AI dishes" are built. The AI Recipe captures the entire MLOps pipeline for the data scientists, ML engineers, and application developers with respect to model development. The AI Chef uses our LEIP SDK to design, train, and optimize AI models for any target hardware within given resource constraints. AI Recipes guide the AI team in designing and developing Edge AI solutions with reduced power, latency, and memory footprints while maintaining model accuracy.
Within the AI factory, the AI Recipe describes the interfaces between AI staff members. Specifically, the AI Recipe captures the model input and output specifications, as defined by the data scientist. The ML engineers then use that information to set optimization parameters for a hardware target. Similarly, the application developers use it to set sensor configurations and other application settings. As part of a CI/CD (continuous integration, continuous deployment) approach, the AI Recipes also define how the model is evaluated and tested, both as a trained model and as a compiled runtime engine.
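To make the handoff points concrete, here is an illustrative sketch of what such a recipe might look like. The field names, hardware target, and thresholds below are our own assumptions for illustration, not the actual LEIP Recipe schema; the point is simply that one artifact can carry the data scientist's model I/O spec, the ML engineer's optimization parameters, the application developer's sensor settings, and the CI/CD evaluation gates.

```python
# Hypothetical recipe structure; field names are assumptions, not the
# real LEIP Recipe schema. Each top-level section corresponds to one
# role's handoff point described above.
recipe = {
    "model_io": {                    # owned by the data scientist
        "input_shape": [1, 3, 224, 224],
        "output_classes": 10,
    },
    "optimization": {                # owned by the ML engineer
        "target_hardware": "arm-cortex-a72",  # hypothetical target name
        "quantization": "int8",
        "max_latency_ms": 50,
    },
    "application": {                 # owned by the application developer
        "sensor": "camera0",
        "frame_rate_fps": 30,
    },
    "evaluation": {                  # CI/CD gates for model and runtime
        "min_accuracy": 0.90,
        "test_dataset": "validation-v1",
    },
}

def ci_gate(measured_accuracy: float, measured_latency_ms: float) -> bool:
    """CI/CD check: a candidate must pass both the model-level accuracy
    gate and the compiled-runtime latency gate before deployment."""
    return (measured_accuracy >= recipe["evaluation"]["min_accuracy"]
            and measured_latency_ms <= recipe["optimization"]["max_latency_ms"])
```

In this sketch, a build that reaches 92% accuracy at 40 ms latency passes the gate, while one that meets only the accuracy target or only the latency target does not, mirroring the idea that the recipe evaluates both the model and the compiled runtime engine.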
Extending the Iron Chef analogy again, the AI Chef leverages AI Recipes as a specification for the AI factory to operate smoothly and efficiently. Each recipe serves as a blueprint for a factory line within the AI factory. For example, the AI factory can be customized through recipes to build models for classification, detection, and segmentation. Each AI staff member knows the expected handoff points and the overall flow.
Latent AI Recipes help reduce time to market with a reproducible MLOps workflow. Recipes help the AI factory scale quickly, with many combinations of models and hardware targets immediately available to train and deploy with custom data. Each model trained and optimized with a Recipe is built deterministically, so results are consistent and repeatable. MLOps is complicated, and a model in production needs to be updated over time. A deployed model may break when it encounters scenarios that were not represented during training. If the AI factory can't repeat or reproduce a result, it is game over. With Latent AI LEIP Recipes, you can scale up the deployment of Edge AI across different product requirements.
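The determinism point can be sketched in a few lines. This is a generic illustration of the principle, not LEIP code: we assume a recipe pins a random seed (the `seed` parameter below is our own stand-in), so that two runs from the same recipe draw identical values, which is what makes a result repeatable when a production model has to be retrained or debugged.

```python
import random

def sample_run(seed: int, n: int = 5) -> list:
    """Stand-in for one 'training run': draw n values from an isolated,
    seeded generator. Pinning the seed is what makes the run repeatable."""
    rng = random.Random(seed)        # instance-level generator, no global state
    return [rng.random() for _ in range(n)]

run_a = sample_run(seed=42)
run_b = sample_run(seed=42)
assert run_a == run_b                # same pinned seed: identical results
```

Real training pipelines have more sources of nondeterminism (data shuffling, parallelism, hardware), which is exactly why capturing the full configuration in a recipe, rather than just the seed, matters for reproducibility.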
To learn more about how LEIP Recipes are an integral part of AI factory MLOps, contact us at firstname.lastname@example.org.