NXP has launched an edge intelligence environment called eIQ that provides a comprehensive machine learning (ML) toolkit with support for TensorFlow Lite, Caffe2, and other neural network frameworks, as well as non-neural ML algorithms.
The environment enables turnkey ML solutions for voice, vision, and anomaly detection applications, spanning data acquisition, trained models, and user feature customisation. These solutions are designed to work with NXP's EdgeScale software, which provides secure device on-boarding, provisioning, and container management for ML applications targeting i.MX and Layerscape applications processors.
"Having long recognised that processing at the edge node is really the driver for customer adoption of machine learning, we created scalable ML solutions and eIQ tools, to make transferring artificial intelligence capabilities from the cloud-to-the-edge even more accessible and easy to use," said Geoff Lees, senior vice president and general manager of microcontrollers.
With support for NXP's full microcontroller (MCU) and applications processor product line, eIQ provides the building blocks that developers need to implement ML in edge devices. Keeping pace with ML's changing landscape, NXP eIQ is continuously expanding to include: data acquisition and curation tools; model conversion for a wide range of neural network (NN) frameworks and inference engines, such as TensorFlow Lite, Caffe2, CNTK, and Arm® NN; support for emerging NN compilers such as Glow and XLA; classical ML algorithms (e.g. support vector machines and random forests); and tools to deploy the models for heterogeneous processing on NXP embedded processors.
NXP also recently introduced a software infrastructure called EdgeScale to unify how data is collected, curated, and processed at the edge, with a focus on enabling ML applications. EdgeScale enables seamless integration with cloud-based artificial intelligence (AI) / ML services and deployment of cloud-trained models and inferencing engines on all NXP devices, from low-cost MCUs to high-performance i.MX and Layerscape applications processors.
Building on the eIQ environment, the company introduced turnkey solutions for edge-based learning and local execution of vision, voice, and anomaly detection models. These system-level solutions provide the hardware and software necessary for building fully functional applications, while allowing customers to add their own differentiation. The solutions are modular, making it easy for customers to expand the functionality of their products with a simple plug-in. For example, a voice recognition module can be easily added to a product that has NXP's vision recognition solution.
Demonstrations include facial recognition training on the high-performance i.MX 8QM and deployment of the extracted inference engines on mid-range i.MX 8QXP and i.MX 8M applications processors using secure Docker containers; CMSIS-NN performance benchmarking with CIFAR-10 on the just-announced LPC5500 MCUs; and anomaly detection with classical machine learning techniques on Cortex-M4F-based Kinetis MCUs.
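NXP does not detail how the Kinetis anomaly-detection demonstration works, but classical (non-neural) anomaly detection of the kind that fits on a Cortex-M4F MCU is often a statistical threshold on a simple per-window feature. The sketch below is a generic, illustrative Python example under that assumption; all names, data, and thresholds are hypothetical and are not taken from eIQ. It fits the mean and standard deviation of window RMS on "normal" sensor data, then flags windows whose RMS deviates by more than three sigmas.

```python
import math

def rms(window):
    """Root-mean-square amplitude of one sensor window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def fit_baseline(windows):
    """Learn the mean and standard deviation of window RMS on 'normal' data."""
    feats = [rms(w) for w in windows]
    mean = sum(feats) / len(feats)
    var = sum((f - mean) ** 2 for f in feats) / len(feats)
    return mean, math.sqrt(var)

def is_anomaly(window, mean, std, threshold=3.0):
    """Flag a window whose RMS deviates more than `threshold` sigmas."""
    return abs(rms(window) - mean) > threshold * std

# Simulated 'healthy' vibration: low-amplitude sine windows at varying phase.
normal = [[0.1 * math.sin(0.3 * i + p) for i in range(64)] for p in range(20)]
mean, std = fit_baseline(normal)

# A high-amplitude 'fault' window deviates far beyond the learned baseline.
fault = [2.0 * math.sin(0.3 * i) for i in range(64)]
print(is_anomaly(fault, mean, std))                    # True
print(sum(is_anomaly(w, mean, std) for w in normal))   # small: at most 2 of 20 can exceed 3 sigma
```

A detector like this needs only a handful of multiply-accumulate operations per sample and a few bytes of learned state, which is why such techniques suit small Cortex-M class devices; in practice one would use richer features (spectral bins, kurtosis) or the SVM/random-forest algorithms eIQ mentions.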
Localised voice and vision ML applications include a voice-enabled solution for local wake-word detection and an end-user-programmable voice control experience, both on the i.MX RT1050 crossover processor; a vision system using the Au-Zone DeepView ML Kit on the i.MX 8QM, implemented in a microwave oven; and traffic sign recognition on the low-cost i.MX RT1050 crossover processor.