Learn how the Extended Kalman Filter (EKF) handles non-linear systems for accurate object tracking and sensor fusion. Enhance your [YOLO26](https://docs.ultralytics.com/models/yolo26/) projects on the [Ultralytics Platform](https://platform.ultralytics.com).
The Extended Kalman Filter (EKF) is a robust mathematical algorithm designed to estimate the state of a dynamic system that behaves non-linearly. While the standard Kalman Filter (KF) provides an optimal solution for systems moving in straight lines or following simple linear equations, real-world physics is rarely that predictable. Most physical objects, such as a drone fighting wind resistance or a robotic arm rotating on multiple axes, follow curved or complex paths. The EKF addresses this complexity by creating a linear approximation of the system at a specific point in time, allowing engineers and data scientists to apply efficient filtering techniques to predictive modeling tasks even when the underlying mechanics are complicated.
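Concretely, that linear approximation is a first-order Taylor expansion of the non-linear state-transition function $f$ around the current estimate $\hat{\mathbf{x}}$ (the notation here follows the standard EKF literature rather than any specific library):

$$
f(\mathbf{x}) \approx f(\hat{\mathbf{x}}) + F\,(\mathbf{x} - \hat{\mathbf{x}}), \qquad F = \left.\frac{\partial f}{\partial \mathbf{x}}\right|_{\hat{\mathbf{x}}}
$$

The matrix $F$ is the Jacobian mentioned below; the same idea is applied to the measurement function when it, too, is non-linear.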
To handle complex dynamics, the EKF employs a mathematical process called linearization, which essentially estimates the slope of a function at the current operating point. This often involves calculating a Jacobian matrix to approximate how the system changes over short intervals. The algorithm operates in a recursive loop consisting of two main phases: prediction and update. In the prediction phase, the filter projects the current state forward using a physical model of motion. In the update phase, it corrects this projection using new, often noisy data from sensors like gyroscopes or accelerometers. This continuous cycle of predicting and correcting helps reduce data noise and provides a smoother, more accurate estimate of the true state than any single sensor could provide alone.
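The snippet below is a minimal, self-contained sketch of one predict-update cycle. It is purely illustrative and is not the implementation used inside any Ultralytics tracker; the state layout, noise covariances, and the simulated range-bearing measurement are all assumptions made for the example.

```python
import numpy as np

# State: [x, y, vx, vy] with a constant-velocity motion model (linear f here),
# and a non-linear range-bearing measurement h(x) from a sensor at the origin.
dt = 0.1
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)  # motion-model Jacobian (exact, since f is linear)
Q = np.eye(4) * 0.01       # process noise covariance (assumed)
R = np.diag([0.5, 0.01])   # measurement noise covariance (assumed)

x = np.array([5.0, 3.0, 1.0, 0.0])  # initial state estimate
P = np.eye(4)                        # initial state covariance


def h(state):
    """Non-linear measurement: range and bearing to the object."""
    px, py = state[0], state[1]
    return np.array([np.hypot(px, py), np.arctan2(py, px)])


def H_jacobian(state):
    """Jacobian of h, linearizing the measurement model at the current estimate."""
    px, py = state[0], state[1]
    r = np.hypot(px, py)
    return np.array([[px / r, py / r, 0, 0],
                     [-py / r**2, px / r**2, 0, 0]])


# One predict-update cycle with a (simulated) noisy measurement z
x = F @ x                        # predict: project the state forward
P = F @ P @ F.T + Q              # predict: grow uncertainty with process noise

z = np.array([6.0, 0.52])        # simulated noisy range/bearing reading
Hk = H_jacobian(x)               # linearize the measurement model
y = z - h(x)                     # innovation (measurement residual)
S = Hk @ P @ Hk.T + R            # innovation covariance
K = P @ Hk.T @ np.linalg.inv(S)  # Kalman gain
x = x + K @ y                    # update: correct the prediction
P = (np.eye(4) - K @ Hk) @ P     # update: shrink uncertainty

print("Corrected position estimate:", x[:2])
```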
In computer vision (CV), the Extended Kalman Filter plays an important role in maintaining the identity of moving objects. Advanced models such as YOLO26 excel at detecting objects in individual frames, but they do not inherently understand the continuity of motion over time. By integrating an EKF or similar logic, an object tracking system can predict where a bounding box should appear in the next video frame based on the object's previous velocity and trajectory. This is especially useful for handling occlusions, when an object is temporarily hidden from view; the filter keeps the track alive by estimating the object's position until it becomes visible again, an essential technique for reliable multi-object tracking (MOT).
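As a rough sketch of that idea (the numbers and the constant-velocity assumption are illustrative, not the internals of any particular tracker), the prediction step alone can coast a box center through a few occluded frames:

```python
import numpy as np

# Hypothetical example: coast a track through an occlusion using only
# constant-velocity prediction of the bounding-box center.
dt = 1.0                             # one frame
center = np.array([320.0, 240.0])    # last confirmed box center (pixels)
velocity = np.array([4.0, -1.5])     # estimated pixels per frame

for frame in range(1, 4):            # object hidden for three frames
    center = center + velocity * dt  # predict-only: no measurement to correct with
    print(f"Frame +{frame}: predicted center at {center}")

# When the detector sees the object again, the new detection is matched to this
# predicted position and the filter's update step corrects the estimate.
```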
The versatility of the EKF makes it a cornerstone technology in various high-tech industries where machine learning (ML) intersects with physical hardware:
To understand the specific utility of the Extended Kalman Filter, it is helpful to distinguish it from related filtering methods:
In the ultralytics package, tracking algorithms use Kalman filtering concepts internally to smooth trajectories and associate detections across frames. While you do not manually code the EKF matrix math when using high-level tools, understanding that it powers the tracker helps when configuring tracking parameters on the Ultralytics Platform.
Here is how to run tracking with a YOLO model, which uses these filtering techniques for state estimation:
from ultralytics import YOLO

# Load the latest YOLO26 model (nano version for speed)
model = YOLO("yolo26n.pt")

# Track objects in a source (an image here; a video file or stream works the same way)
# Trackers like BoT-SORT or ByteTrack use Kalman filtering logic internally
results = model.track(source="https://ultralytics.com/images/bus.jpg", tracker="botsort.yaml")

# Print the IDs of the tracked objects
for r in results:
    if r.boxes.id is not None:
        print(f"Track IDs: {r.boxes.id.int().cpu().tolist()}")