Key highlights from Ultralytics at Embedded World 2026

Join us as we look back at Ultralytics’ experience at Embedded World 2026, showcasing Ultralytics YOLO26 running on edge devices through various live demos.

Recently, the Ultralytics team has been busy connecting with the vision AI community at events around the world. One of those stops was the Embedded World Exhibition & Conference, held from March 10 to 12 at NürnbergMesse in Nuremberg, Germany.

This event is one of the leading global gatherings for the embedded systems community, bringing together engineers, hardware vendors, software developers, and industry experts to explore the latest innovations in embedded computing and edge technologies.

With an audience of more than 32,000 attendees, Embedded World creates a space for companies and developers to share ideas, discover new technologies, and discuss the future of embedded systems. As edge AI and embedded vision continue to gain momentum, Embedded World has become an important meeting point for teams building intelligent systems that run directly on devices. 

This year, Glenn Jocher, our Founder & CEO, Paula Derrenger, our VP of Growth, Francesco Mattioli, our Lead Partnerships Engineer, and our Account Executives Alex Wong and Jake Qian represented Ultralytics at the event. Over three days, the team connected with partners, explored the expo halls, and showcased Ultralytics YOLO26 running on a range of embedded hardware platforms.

Fig 1. The Ultralytics team at Embedded World 2026

In this recap, we’ll walk through some of the key highlights from our time at Embedded World 2026. Let’s get started! 

An overview of Embedded World 

Embedded World is a major international event dedicated to embedded systems and the technologies that power them. Held each year at NürnbergMesse in Nuremberg, Germany, this exhibition and conference brings together companies and professionals working across embedded computing, electronics, and edge AI.

The event features a mix of exhibition halls, technical conferences, expert panels, and networking sessions. Throughout the event, engineers, suppliers, developers, and technology companies share insights, feature new hardware platforms, and discuss how embedded systems are evolving.

In recent years, the conversation at Embedded World has increasingly centered around edge computing and embedded vision. As AI models become more efficient, running computer vision directly on devices is becoming more practical across industries.

This has made the event an important platform for the broader AI ecosystem. Hardware manufacturers, AI companies, and embedded developers connect to collaborate, exchange ideas, and explore how intelligent systems can run directly on devices at the edge.

A recap of Ultralytics at Embedded World in 2025

Our team was excited to return to Embedded World after a great experience at last year's edition. At Embedded World 2025, also held in Nuremberg, the expo floor was filled with examples of embedded vision in action.

Interestingly, many hardware manufacturers were using Ultralytics YOLO models to demonstrate real-time computer vision on their latest platforms, from compact development boards to advanced edge AI accelerators.

Later in the year, we also joined Embedded World North America in Anaheim, California. There, we connected with engineers, developers, and hardware teams exploring how to deploy computer vision in real-world environments where efficiency, power consumption, and fast response times are critical.

Across both events, it was exciting to see how widely Ultralytics YOLO models are being used to power embedded vision applications. From robotics and automation to smart devices and industrial systems, these experiences highlighted how quickly vision AI is moving onto edge devices.

Key trends at Embedded World 2026

As in previous years, Embedded World 2026 highlighted several key trends shaping the embedded and edge AI ecosystem.

Here’s an overview of some of the trends that stood out during the event:

  • Embedded vision: Computer vision is becoming a core capability in embedded systems. Many demos highlighted how devices can analyze visual data in real time for tasks like inspection, monitoring, and automation.
  • Efficient edge AI hardware: Running vision AI models directly on small, power-efficient devices was a major focus. Many companies showcased hardware platforms designed to support real-time AI inference without relying on cloud infrastructure.
  • AI hardware accelerators: Dedicated accelerators built specifically for AI workloads are becoming increasingly common. These chips help improve performance while keeping power consumption low, making them great for edge deployments.
  • AI-hardware ecosystem collaboration: The event also shone a spotlight on collaboration between AI software platforms and embedded hardware manufacturers. These partnerships make it easier to deploy AI models across different devices.
  • Real-world applications: Many demos focused on practical uses of embedded vision across industries such as robotics, manufacturing, and healthcare. These examples showed how edge AI is being applied to solve real-world challenges.
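
One practical implication of these trends is that each hardware target tends to expect a different deployment format. As an illustrative sketch, here is how that mapping might look using Ultralytics export format strings (the pairings below are our own simplification for illustration, not an official compatibility matrix):

```python
# Illustrative mapping from edge hardware targets to Ultralytics export
# format strings. The pairings are a simplification, not an official
# compatibility matrix.
EXPORT_FORMATS = {
    "intel": "openvino",     # OpenVINO IR for Intel CPUs, iGPUs, and NPUs
    "raspberry_pi": "ncnn",  # NCNN runs efficiently on Arm CPUs
    "nvidia": "engine",      # TensorRT engine for NVIDIA GPUs
    "hailo": "onnx",         # Hailo toolchains typically ingest ONNX
}


def export_format_for(target: str) -> str:
    """Return an export format string for an edge target, with ONNX as a
    broadly supported fallback."""
    return EXPORT_FORMATS.get(target.lower(), "onnx")
```

With the Ultralytics package installed, the chosen string would then be passed to `model.export(format=...)` to produce deployable weights for that target.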

Ultralytics at Embedded World in 2026

Next, let’s take a closer look at Ultralytics’ experience at Embedded World 2026 and some of the highlights from the event.

Setting up the booth before the event

Before the exhibition officially opened, the team arrived early to set up the booth and prepare the demos for the days ahead. Several edge AI demonstrations were prepared to showcase Ultralytics YOLO models, including YOLO26, running real-time computer vision inference on embedded hardware platforms.

Fig 2. A glimpse of the Ultralytics YOLO demos at our booth

Hardware systems were arranged across the booth, and the demos were configured and tested to make sure everything ran smoothly. After the final checks, the team wrapped up the day eager to welcome visitors to the booth.

Day one: Live demos and workshops with collaborators

The first day of the event was filled with conversations and live demos at the Ultralytics booth. Visitors had the chance to see Ultralytics YOLO models running in real time across a variety of embedded hardware platforms, including systems from Axelera AI, Intel, AAEON, STMicroelectronics, DEEPX, Raspberry Pi, and Hailo. 

The demos put computer vision tasks such as object detection and pose estimation on display in real time.

Fig 3. An example of Ultralytics YOLO models running on DEEPX hardware

Later in the afternoon, Francesco Mattioli, our Lead Partnerships Engineer, took part in a workshop titled “Accelerating Vision AI on Intel Core Ultra with OpenVINO & Ultralytics YOLO,” organized in collaboration with Intel and AAEON. During the session, he demonstrated how Ultralytics YOLO models can run on Intel® Core™ Ultra processors using the OpenVINO™ toolkit, highlighting how optimized hardware and software can support efficient computer vision directly on embedded devices.

Fig 4. Francesco Mattioli presenting during the workshop

Overall, embedded vision was one of the most talked-about topics throughout the day as developers explored new ways to bring AI capabilities directly to embedded platforms.

Day two: Connecting with the edge AI ecosystem

The second day of Embedded World 2026 was largely focused on connecting with partners and continuing conversations across the embedded AI ecosystem. Throughout the exhibition halls, the team met with several partners, including Axelera AI and DEEPX, to discuss ongoing collaborations and explore new opportunities around edge AI deployments.

Many of these discussions centered on how computer vision models can be integrated with embedded hardware platforms to support real-world applications. From hardware acceleration to optimized inference at the edge, the conversations reflected the growing interest in running AI directly on devices.

Another interesting moment came during the D-Robotics networking session, where developers, hardware companies, and industry experts gathered to connect and exchange ideas. During the session, Francesco Mattioli delivered a talk on embedded vision and the role of AI in smart edge systems.

Later in the evening, the team joined an Edge AI Networking Dinner, where partners and developers gathered to continue conversations about the future of AI at the edge in a more relaxed setting.

Day three: Exploring the exhibition floor

On the final day, the team took time to explore the wider exhibition halls and reconnect with partners across the show floor. It was a great way to see how different companies are integrating AI into embedded systems and to observe the broader trends shaping the industry.

One of the highlights was seeing Ultralytics YOLO models showcased across several partner booths throughout the expo. Many companies were demonstrating real-time computer vision applications running on embedded hardware, showing how vision AI is becoming a practical tool for edge deployments.

A variety of industries were represented in these demos, including healthcare, robotics, automotive, and manufacturing. These examples illustrated how embedded vision systems are being applied to solve real-world challenges, from improving automation in factories to enabling smarter robotic systems and supporting advanced monitoring solutions in healthcare environments.

Key takeaways

Embedded World 2026 was a great opportunity for the Ultralytics team to connect with partners, showcase Ultralytics YOLO26 on embedded hardware, and explore the rapidly growing edge AI ecosystem. Across workshops, live demos, and conversations on the expo floor, it was exciting to see how quickly vision AI is moving onto edge devices. Thanks to everyone who stopped by to connect with us, and we look forward to continuing to build the future of computer vision together.

Join our growing community! Explore our GitHub repository to learn more about AI. Discover AI applications like computer vision in manufacturing and AI in logistics by visiting our solution pages. To start building with computer vision, check out our licensing options.

Let’s build the future of AI together!
