Overcoming current limitations in processing power, energy, and cost to enable a new class of computer vision use cases at the edge, delivering high-performance output at a fraction of the cost and energy consumption of existing solutions.
Ultralytics and STMicroelectronics teamed up to efficiently deploy YOLO models on low-power microcontrollers and achieve accurate, real-time inference at the edge.
As AI adoption increases across industries, the demand for high-performance, low-power solutions capable of running real-time inference at the edge is growing rapidly as well. To meet this need, STMicroelectronics introduced the STM32N6 microcontroller, featuring an integrated Neural Processing Unit (NPU) designed for embedded AI workloads.
By running Ultralytics YOLO models on the STM32N6, STMicroelectronics demonstrated that accurate and efficient embedded Vision AI is possible on microcontrollers, opening new opportunities for scalable, on-device intelligence across sectors like smart cities, healthcare, and consumer electronics.
STMicroelectronics is a global leader in semiconductor technology, with over 50,000 employees and more than 200,000 customers worldwide. The company designs and builds chips that enable applications ranging from electric vehicles and industrial equipment to smart home devices and consumer electronics.
As more industries turn to AI to make devices smarter and more responsive, STMicroelectronics has been focused on bringing those capabilities directly to the edge. For instance, their STM32N6 microcontroller, a powerful, energy-efficient chip, can handle on-device AI tasks like computer vision.
To help developers build embedded vision applications on STM32N6, STMicroelectronics looked for flexible, high-performing models that could run efficiently on a microcontroller. Ultralytics YOLO models turned out to be a great fit, offering a reliable combination of speed, accuracy, and ease of integration.
Before the concept of Edge AI became widely accepted, computer vision models were typically developed to run on large, centralized systems such as cloud servers or GPUs (Graphics Processing Units). These platforms offered the computing capabilities needed to train and deploy large models, but they also introduced limitations such as high energy consumption, network dependency, latency, and increased operational costs.
As interest grew in implementing smarter, real-time applications in industries like healthcare, consumer electronics, and smart cities, it became clear that pushing AI processing closer to where data is generated, on the device itself, was both a technical necessity and a strategic opportunity.
However, running AI models on low-power microcontrollers can be challenging. These devices generally have limited memory, computing power, and energy capacity, making it difficult to deploy complex vision models without compromising performance or accuracy.
STMicroelectronics needed to identify a suite of models versatile enough to bring reliable, real-time computer vision capabilities to their STM32N6 microcontroller, without requiring developers to drastically simplify their models or workflows. Their goal was to deliver meaningful on-device AI while staying within the strict constraints of embedded systems.
To enable advanced AI on low-power embedded devices, STMicroelectronics introduced the STM32N6, a high-performance microcontroller equipped with the Neural-ART Accelerator™, an in-house Neural Processing Unit (NPU) built specifically for edge AI workloads. This technology makes it possible for developers to run AI inference directly on the device, reducing reliance on cloud computing while improving speed, responsiveness, and energy efficiency.
STMicroelectronics partnered with Ultralytics to evaluate and showcase STM32N6’s capabilities by running Ultralytics YOLO models on the microcontroller. Known for their balance of speed and accuracy, Ultralytics YOLO models are well-suited for resource-constrained environments and embedded deployments.
By running various YOLO model variants directly on the STM32N6, STMicroelectronics was able to demonstrate a range of Vision AI use cases, such as object detection, classification, and tracking, all within the power and memory limits of a microcontroller. This collaboration provides developers with a reliable option for deploying real-time, AI-powered embedded systems using scalable, production-ready vision models.
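A deployment like the one described above typically starts by exporting a trained YOLO model to a portable, quantized format that an embedded toolchain such as ST's STM32Cube.AI can then convert for the target NPU. The sketch below shows one plausible first step using the Ultralytics Python API; the 256×256 input size and int8 quantization are assumptions based on the benchmark figures mentioned later, not a documented ST workflow.

```python
from ultralytics import YOLO

# Load the pretrained nano-sized detection model, the smallest
# variant and the best fit for microcontroller memory budgets.
model = YOLO("yolov8n.pt")

# Export to int8-quantized TFLite at a 256x256 input size. The
# resulting .tflite file would then be fed into an embedded
# toolchain (e.g. STM32Cube.AI) for conversion to the target NPU.
model.export(format="tflite", imgsz=256, int8=True)
```

Int8 quantization is the usual choice here because it shrinks the model roughly fourfold versus float32 and matches the integer arithmetic that embedded NPUs execute most efficiently.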
Ultralytics YOLO models offered STMicroelectronics the right combination of accuracy, efficiency, and versatility needed for AI-enabled embedded systems. The models are lightweight enough to run on low-power microcontrollers like the STM32N6, yet powerful enough to deliver real-time object detection and instance segmentation performance.
For example, when running the Ultralytics YOLOv8n model at 256 by 256 resolution on the STM32N6, the system reached 34 frames per second, with each inference taking about 29 milliseconds. Energy measurements showed it used only 9.4 millijoules per inference, making it well-suited for real-time vision tasks on low-power devices.
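The throughput and energy figures above are mutually consistent, which a quick back-of-the-envelope check confirms (the numbers below are the reported values, not new measurements):

```python
# Sanity-check the reported STM32N6 benchmark figures.
latency_s = 0.029          # ~29 ms per inference
energy_per_inf_j = 9.4e-3  # 9.4 mJ per inference

# Throughput implied by the latency: 1 / 0.029 s ~ 34.5 FPS,
# matching the reported 34 frames per second.
fps = 1 / latency_s

# Average power while inferring continuously:
# 9.4 mJ/inference * ~34.5 inferences/s ~ 0.32 W.
avg_power_w = energy_per_inf_j * fps

print(round(fps, 1), round(avg_power_w, 2))
```

At roughly a third of a watt under continuous inference, and less when duty-cycled, the figures sit comfortably within a battery-powered budget.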
With support for multiple YOLO model variants, developers have the flexibility to fine-tune for speed, size, or accuracy depending on their application's constraints. The easy-to-integrate architecture, combined with strong community and documentation support, made Ultralytics YOLO a natural fit for STMicroelectronics’ goal of accelerating Vision AI adoption across a wide range of embedded use cases.
Through an Ultralytics Enterprise license, STMicroelectronics provides customers with access to the full suite of YOLO models for internal testing and development. However, for any commercial deployment, customers are required to request their own commercial license directly from Ultralytics via the license form. This ensures compliance and supports a scalable path to production-ready Vision AI solutions.
The ability to run Ultralytics YOLO models directly on the STM32N6 microcontroller has unlocked a wide range of Vision AI applications for STMicroelectronics and its developer ecosystem. By delivering fast, accurate inference on-device without relying on external processing or cloud connectivity, this solution makes it possible to deploy intelligent features in compact, low-power systems.
Customers are exploring use cases such as real-time pedestrian and vehicle detection in smart city infrastructure, on-device safety checks and quality control in industrial automation, and AI-assisted diagnostics in portable healthcare tools. Similarly, in consumer electronics, YOLO models enable responsive features like presence detection, gesture recognition, and object tracking, all within the performance constraints of battery-operated devices.
As AI continues to evolve, STMicroelectronics is focused on making it easier to bring powerful, efficient solutions to edge devices. By working closely with partners like Ultralytics, they’re helping developers get started faster with ready-to-use models, tools, and STM32-compatible resources.
Take the next step in edge AI innovation. Visit our GitHub repository to discover how Ultralytics YOLO models are transforming embedded vision. Explore applications of AI in healthcare and computer vision in retail, and check out our licensing options today!