Prerequisites for Deep Learning with TensorFlow Lite Models
MathWorks Products
To perform inference with TensorFlow™ Lite models in MATLAB®, or by using MATLAB Function blocks in Simulink® models, you must install:
Deep Learning Toolbox™
Deep Learning Toolbox Interface for TensorFlow Lite
To generate code for TensorFlow Lite models, you must also install MATLAB Coder™.
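To confirm the installation, you can list the installed products and support packages at the MATLAB command line. Both commands below are standard MATLAB functions; the entries to look for are Deep Learning Toolbox, MATLAB Coder, and Deep Learning Toolbox Interface for TensorFlow Lite.

    % List installed MathWorks products; the output should include
    % Deep Learning Toolbox and, if you generate code, MATLAB Coder.
    ver

    % List installed support packages; the output should include
    % Deep Learning Toolbox Interface for TensorFlow Lite.
    matlabshared.supportpkg.getInstalled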
Third-Party Hardware and Software
Deployment Platform: MATLAB host computer or ARM® processor.

Software Libraries: TensorFlow Lite version 2.4.1 on the host computer or target. For information on building the library, see this post in MATLAB Answers™: https://www.mathworks.com/matlabcentral/answers/1631265. Supported models include:

Multi-input networks are not supported. TensorFlow Lite models are forward and backward compatible, so if your model was created using a different version of the library but contains layers that are available in version 2.4.1, you can still generate code and deploy the model.

Operating System Support: Linux® only. CentOS and Red Hat® distributions are not supported.

C++ Compiler: MATLAB Coder locates and uses a supported installed compiler. For the list of supported compilers, see Supported and Compatible Compilers on the MathWorks® website. The C++ compiler must support C++11.
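To see which C++ compiler MATLAB has selected, or to switch to a different supported compiler, you can run the standard mex setup command:

    % Show the selected C++ compiler and list the other supported
    % compilers that MATLAB found on this system.
    mex -setup c++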
Environment Variables
MATLAB Coder uses environment variables to locate the libraries required to generate code for deep learning networks.
For deployment on the MATLAB host computer, set these environment variables on the host, as shown in the example after this list:
TFLITE_PATH: Location of the TensorFlow Lite library directory.

LD_LIBRARY_PATH: Location of the run-time shared library. For example, TFLITE_PATH/lib/tensorflow/lite.
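For example, from within MATLAB you might set the host variables as follows. This is a minimal sketch; the path /home/user/tflite is a placeholder for wherever you built the TensorFlow Lite library.

    % Point TFLITE_PATH at the TensorFlow Lite library directory
    % (placeholder path; substitute your own build location).
    setenv('TFLITE_PATH','/home/user/tflite');

    % Append the run-time shared library location to LD_LIBRARY_PATH.
    setenv('LD_LIBRARY_PATH', ...
        [getenv('LD_LIBRARY_PATH') ':' ...
         fullfile(getenv('TFLITE_PATH'),'lib','tensorflow','lite')]);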
For deployment on an ARM processor, set these environment variables on the target hardware board:
TFLITE_PATH: Location of the TensorFlow Lite library directory.

LD_LIBRARY_PATH: Location of the run-time shared library. For example, TFLITE_PATH/lib/tensorflow/lite.

TFLITE_MODEL_PATH: Location of the TensorFlow Lite model that you intend to deploy.
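With the environment variables set, inference in MATLAB follows the pattern below. This is a sketch: the model file name and the input size are placeholders, and the input data must match the size and type that your model expects.

    % Load the TensorFlow Lite model (returns a TFLiteModel object).
    net = loadTFLiteModel('model.tflite');

    % Placeholder input; replace with data sized to the model's input.
    I = rand(224,224,3,'single');

    % Run inference on the input data.
    out = predict(net,I);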
See Also
loadTFLiteModel | predict | TFLiteModel