The eIQ Machine Learning (ML) software development environment leverages inference engines, neural network compilers, optimized libraries, deep learning toolkits and open-source technologies for easier, more secure system-level application development and automotive-quality ML enablement.
OVERVIEW
The NXP® eIQ (“edge intelligence”) ML software environment provides the key ingredients to perform inference with neural network (NN) models on embedded systems and to deploy ML algorithms on NXP microprocessors and microcontrollers for edge nodes. It includes inference engines, NN compilers, libraries, and hardware abstraction layers that support Google TensorFlow Lite, Glow, Arm® NN, Arm CMSIS-NN, and OpenCV.
With NXP’s i.MX applications processors and i.MX RT crossover processors, based on Arm Cortex®-A and Cortex-M cores respectively, embedded designs can now support deep learning applications that require high-performance data analytics and fast inferencing.
eIQ software includes a variety of application examples that
demonstrate how to integrate neural networks into voice,
vision and sensor applications. Developers can choose to deploy their ML applications on Arm Cortex-A cores, Cortex-M cores or GPUs, or, for high-end acceleration, on the neural processing unit (NPU) of the i.MX 8M Plus.
APPLICATIONS
eIQ ML software helps enable a variety of vision and sensor applications, working in conjunction with a collection of device drivers and functions for cameras, microphones and a wide range of environmental sensor types:
• Object detection and recognition
• Voice command and keyword recognition
• Anomaly detection
• Image and video processing
Other AI and ML applications include:
• Smart wearables
• Intelligent factories and smart buildings
• Healthcare and diagnostics
• Augmented reality
• Logistics
• Public safety
FEATURES
• Open-source inference engines
• Neural network compilers
• Optimized libraries
• Application samples
• Included in NXP’s Yocto Linux® BSP and MCUXpresso SDK software releases
eIQ MACHINE LEARNING
SOFTWARE DEVELOPMENT
ENVIRONMENT
FACT SHEET
www.nxp.com/eiq
NXP eIQ MACHINE LEARNING SOFTWARE - INFERENCE ENGINES BY CORE
eIQ™ Inference Engine Deployment (public version; subject to change; 7/6/20)
[Table: NXP eIQ inference engines and libraries, including CMSIS-NN, mapped to the available compute engines — Cortex-M, DSP, Cortex-A, GPU and NPU — on the i.MX 8M Plus, i.MX 8QM, i.MX 8QXP, i.MX 8M Quad/Nano, i.MX 8M Mini, i.MX RT600 and i.MX RT1050/1060. NA = Not Applicable; --- = Not Supported.]
OPEN-SOURCE INFERENCE ENGINES
The following inference engines are included as part of the
eIQ ML software development kit and serve as options for
deploying trained NN models.
Arm NN INFERENCE ENGINE
eIQ ML software supports Arm NN SDK on the i.MX 8 series
applications processor family and is available through the
NXP Yocto Linux-based releases.
Arm NN SDK is open-source inference engine software that
allows embedded processors to run trained deep learning
models. This tool utilizes the Arm Compute Library to
optimize neural network operations running on Cortex-A
cores (using Neon acceleration). NXP has also integrated
Arm NN with proprietary drivers to support the i.MX GPUs
and i.MX 8M Plus NPU.
eIQ SOFTWARE FOR Arm® NN
[Figure: eIQ software stack for Arm NN — neural network (NN) frameworks feed into Arm NN, which runs on the Arm Compute Library over a hardware abstraction layer and drivers targeting the Cortex-A CPU and the VeriSilicon GPU and neural processing unit.]
GLOW
eIQ ML software supports Glow neural network compiler
on the i.MX RT crossover MCU family and is available in the
MCUXpresso SDK.
Glow is a machine learning compiler that enables ahead-of-time (AOT) compilation for increased performance and a smaller memory footprint compared to a traditional runtime inference engine. NXP offers optimizations for its i.MX RT crossover MCUs based on Cortex-M cores and the Cadence® Tensilica® HiFi 4 DSP.
eIQ FOR GLOW NEURAL NETWORK COMPILER
[Figure: Glow ahead-of-time compilation flow. On the host machine (PC or cloud), the neural network model is designed and trained; the pre-trained model, in standard formats, is fed to the Glow AOT NN compiler for model optimization, model compression and model compilation. Where available, the compiler generates ‘external function calls’ to CMSIS-NN kernels or an NN library; otherwise it compiles code from its own native library. The resulting executable code is deployed to the i.MX RT target machine, where inference runs on the Arm® Cortex®-M core and Tensilica® HiFi 4 DSP.]