i.MX Machine Learning User's Guide
●The NXP® eIQ™ Machine Learning Software Development Environment (hereinafter referred to as "NXP eIQ") provides a set of libraries and development tools for machine learning applications targeting NXP microcontrollers and application processors. The NXP eIQ is contained in the meta-imx/meta-ml Yocto layer. See also the i.MX Yocto Project User's Guide (IMXLXYOCTOUG) for more information.
●The following six inference engines are currently supported in the NXP eIQ software stack: Arm NN, TensorFlow Lite, ONNX Runtime, PyTorch, OpenCV, and DeepView™ RT. The following figure shows the supported eIQ inference engines across the computing units.
●The NXP eIQ inference engines support multi-threaded execution on Cortex-A cores. Additionally, Arm NN, ONNX Runtime, TensorFlow Lite, and DeepViewRT also support acceleration on the GPU or NPU through the Neural Network Runtime (NNRT); a minimal TensorFlow Lite example is sketched after the list below. See also eIQ Inference Runtime Overview.
●The NXP eIQ software is designed to support the following key application domains:
◆Vision
▲Multi-camera observation
▲Active object recognition
▲Gesture control
◆Voice
▲Voice processing
▲Home entertainment
◆Sound
▲Smart sense and control
▲Visual inspection
▲Sound monitoring
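The following is a minimal sketch of how an application might use the TensorFlow Lite runtime on an i.MX board: it first tries to load a hardware delegate so that supported operators run on the GPU/NPU, and otherwise falls back to multi-threaded execution on the Cortex-A cores. The delegate library path (/usr/lib/libvx_delegate.so), the model file name, and the thread count are illustrative assumptions; consult the rest of this guide for the delegate and package names that apply to your BSP release.

# Minimal sketch: TensorFlow Lite inference on an i.MX board, optionally
# offloaded to the GPU/NPU through a hardware delegate. The delegate path
# and model name below are assumptions for illustration only.
import numpy as np
import tflite_runtime.interpreter as tflite

DELEGATE_PATH = "/usr/lib/libvx_delegate.so"  # assumed location on the target rootfs

try:
    # Load the hardware delegate so supported operators run on the GPU/NPU.
    delegates = [tflite.load_delegate(DELEGATE_PATH)]
except (ValueError, OSError):
    # Fall back to multi-threaded execution on the Cortex-A cores.
    delegates = []

interpreter = tflite.Interpreter(
    model_path="mobilenet_v1_1.0_224_quant.tflite",  # example model name
    experimental_delegates=delegates,
    num_threads=4,
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input with the shape and dtype the model expects.
shape = input_details[0]["shape"]
dtype = input_details[0]["dtype"]
interpreter.set_tensor(input_details[0]["index"], np.zeros(shape, dtype=dtype))

interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))

A similar delegate- or backend-based offload pattern applies to Arm NN and ONNX Runtime through their respective accelerated backends and execution providers.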
Document type: User's Guide
Available languages: English, Chinese, Chinese and English, Japanese
Release date: 30 September 2021
Revision: LF5.10.52_2.1.0
File size: 2.8 MB