To run LiteRT models using the LiteRT Interpreter, use this software package. To generate C/C++ code from LiteRT models, use the MATLAB Coder Support Package for PyTorch and LiteRT Models instead.
The Deep Learning Toolbox Interface for LiteRT Library enables you to run cosimulations of MATLAB and Simulink applications with LiteRT (formerly known as TensorFlow Lite, or TFLite) models. With this workflow, you can use pretrained LiteRT models, including classification and object detection networks, while implementing the rest of the application code in MATLAB or Simulink for development and testing.
Inference of pretrained LiteRT models is executed by the LiteRT Interpreter, while the rest of the application code is executed by MATLAB or Simulink. Data exchange between MATLAB or Simulink and LiteRT is handled automatically.
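For example, loading and running a pretrained LiteRT model from MATLAB takes only a few lines. The sketch below uses loadTFLiteModel and predict from this interface; the model file name and input size are placeholders for your own network:

    % Load a pretrained LiteRT model (file name is a placeholder)
    net = loadTFLiteModel('mobilenet_v1_1.0_224.tflite');

    % Create a dummy input matching the model's expected input size
    in = rand(224, 224, 3, 'single');

    % Run inference: the LiteRT Interpreter executes the model,
    % while the surrounding application code runs in MATLAB
    scores = predict(net, in);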
When you use this interface with MATLAB Coder, you can generate C++ code for the complete application and deploy it to target hardware. In the generated code, inference of the LiteRT model is executed by the LiteRT Interpreter, while C++ code is generated for the remainder of the MATLAB or Simulink application, including pre- and post-processing. Data exchange between the generated code and the LiteRT Interpreter is again handled automatically.
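A typical code generation workflow wraps the model call in an entry-point function, loads the model once into a persistent variable, and then invokes codegen. The following is an illustrative sketch; the function name, model file, and input size are placeholders:

    function out = litert_predict(in) %#codegen
    % Entry-point function: load the LiteRT model once, then reuse it
    persistent net
    if isempty(net)
        net = loadTFLiteModel('model.tflite'); % placeholder model file
    end
    out = predict(net, in);
    end

    % Generate C++ library code; the input size must match your model
    cfg = coder.config('lib');
    cfg.TargetLang = 'C++';
    codegen -config cfg litert_predict -args {ones(224,224,3,'single')}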
If you need to generate code for the LiteRT models themselves, alongside the pre- and post-processing, use the MATLAB Coder Support Package for PyTorch and LiteRT Models.
Please see the list below for the prerequisites for using this software package:
If you experience download or installation problems, please contact Technical Support:
MATLAB Release Compatibility
- Compatible with R2022a to R2026a
Platform Compatibility
- Windows
- macOS (Apple Silicon)
- macOS (Intel)
- Linux
