Subjective lameness assessment varies between practitioners, and existing objective tools often require dedicated hardware, cloud uploads, or time-consuming processing.
Stride turns a simple smartphone video into fast, on-device equine movement analysis with Ultralytics YOLO, without internet access or special hardware.
As AI continues to support veterinary assessment and clinical decision-making, the demand for objective, accessible tools that can run in real-world field conditions is growing.
Dr Quentin Pleyers developed Stride, an iOS app that uses computer vision to deliver objective gait analysis for horses, helping veterinarians and equine professionals assess movement asymmetries directly from a video recording. By integrating Ultralytics YOLO models, Stride makes it possible to gather objective movement data on horses anywhere, in under a minute, without relying on cloud connectivity or specialized hardware.
Dr Quentin Pleyers is an equine veterinarian based in southern Sweden, with a strong focus on sports medicine, biomechanics, and objective movement analysis. Combining his clinical background with a long-standing passion for software engineering, he built Stride to solve a problem he encountered every day in the field.
Stride was developed independently by Dr Quentin Pleyers. The project now benefits from academic and clinical collaborations with the University of Liège, the European Centre for Horse Studies, the University of Tennessee, and partners across Italy and Estonia. These collaborations focus on validation studies, clinical research, and future applications, while the app itself remains Dr Pleyers' own work.
Lameness is one of the most common and most difficult issues equine vets face. Traditionally, lameness assessment has relied on a vet's trained eye, with results varying between practitioners and proving especially challenging in cases of subtle or multi-leg lameness.
Objective tools do exist, including pressure plates and IMU sensors, but they often require dedicated hardware, controlled environments, and time-consuming data processing. Many computer vision-based systems require uploading large 4K video files to remote servers, a process that can take 10 to 15 minutes in the field, where bandwidth is often limited.
The missing piece was a tool that could deliver objective movement data quickly, on a device many veterinarians and equine professionals already carry in their pocket: a smartphone.
To make this possible, Dr Pleyers built Stride, leveraging Ultralytics YOLO26 models trained for pose estimation, exported to Core ML, and deployed natively on iOS. The app records video of a horse trotting, extracts key anatomical landmarks frame by frame, and analyzes the vertical displacement of points like the head, withers, and pelvis to quantify movement asymmetries.

Crucially, the entire pipeline, from video capture to pose detection to signal processing, runs locally on the device. On an iPhone 17 Pro, Stride processes a full gait analysis in around one minute, with inference times of approximately 10 milliseconds per frame using an Ultralytics YOLO26 medium model.
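Stride's exact signal processing is proprietary, but the underlying idea can be illustrated with a minimal, self-contained sketch. At trot, a landmark such as the head dips once per diagonal step, twice per stride; comparing alternating displacement minima (the MinDiff measure from the lameness literature) quantifies asymmetry. All signals and values below are synthetic and purely illustrative.

```python
import math

def local_minima(signal):
    """Indices of simple strict local minima in a 1-D signal."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] < signal[i - 1] and signal[i] < signal[i + 1]]

def min_diff(signal):
    """Mean difference between alternating displacement minima.

    At trot there are two vertical-displacement minima per stride,
    one per diagonal step; their difference is near zero for a
    symmetric gait and persistently non-zero for an asymmetric one.
    """
    minima = [signal[i] for i in local_minima(signal)]
    diffs = [a - b for a, b in zip(minima[0::2], minima[1::2])]
    return sum(diffs) / len(diffs)

# Synthetic vertical head displacement: one cosine cycle per step
# (24 samples), so two dips per stride. An asymmetric gait is
# simulated by raising every second step by a constant offset.
symmetric = [math.cos(2 * math.pi * t / 24) for t in range(200)]
asymmetric = [y + (0.3 if (t // 24) % 2 else 0.0)
              for t, y in enumerate(symmetric)]

print(round(abs(min_diff(symmetric)), 3))   # ~0.0 for the symmetric trace
print(round(abs(min_diff(asymmetric)), 3))  # clearly non-zero when asymmetric
```

Real keypoint traces are noisier than this, so a production pipeline would also filter the signal and segment strides before comparing minima.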
Stride was trained using thousands of manually annotated images of horses, captured across a wide range of breeds, coat colors, lighting conditions, and backgrounds. The training process was streamlined using Ultralytics Platform, which Dr Pleyers used to develop, iterate on, and refine the model that powers the app today.
For Dr Quentin Pleyers, Ultralytics YOLO offered the right balance of performance, flexibility, and ease of use needed to bring an idea from prototype to production.

After exploring multiple computer vision frameworks, Dr Pleyers found that Ultralytics YOLO models offered both the accuracy required for biomechanical analysis and the lightweight efficiency needed to run smoothly on mobile devices. The ability to easily export to Core ML and deploy natively on iOS was a key factor in making Stride a fully offline, field-ready tool.
Stride is not designed to replace a vet's clinical judgment, and that distinction is important to note. The app does not diagnose lameness; it provides an objective measurement of asymmetry in a horse's vertical movement, giving veterinarians and equine professionals one more reliable data point to support their overall assessment.

This approach is helping Stride gain traction across the equine veterinary community, particularly among newer generations of vets who are comfortable integrating digital tools into their workflows. By delivering objective data in real time, on a familiar device, Stride helps reduce variability in assessments and supports more confident, evidence-based clinical decisions.
Dr Pleyers is now expanding Stride to Android, with the goal of making objective equine movement analysis accessible to veterinarians, equine professionals, trainers, therapists, farriers, and horse owners worldwide. By helping users identify movement asymmetries as early as possible, Stride aims to support earlier intervention, better follow-up, and the highest possible standards of equine welfare. Continued collaboration with academic and clinical partners will further validate Stride's role in practice and explore new applications for objective movement analysis in equine medicine.
By combining decades of clinical expertise with cutting-edge computer vision, Stride represents a new chapter in how technology can support, rather than replace, the trained eye of a skilled veterinarian.
Interested in building Vision AI solutions of your own? Visit our GitHub repository to explore Ultralytics YOLO models, learn how YOLO is driving innovations across industries like AI in healthcare, and check out our licensing options to get started.
Ultralytics YOLO models are computer vision architectures developed to analyze visual data from images and video. These models can be trained for tasks including object detection, classification, pose estimation, tracking, and instance segmentation. Ultralytics YOLO models include:
Ultralytics YOLO11 supports all the computer vision tasks the Vision AI community has come to love about YOLOv8, while delivering greater performance and accuracy, making it a powerful tool and a strong ally for real-world industry challenges.
The model you choose depends on your specific project requirements; key factors to weigh include performance, accuracy, and deployment needs. Here's a quick overview:
Ultralytics YOLO repositories, such as YOLOv5 and YOLO11, are distributed under the AGPL-3.0 License by default. This OSI-approved license is designed for students, researchers, and enthusiasts, promoting open collaboration and requiring that any software using AGPL-3.0 components also be open-sourced. While this ensures transparency and fosters innovation, it may not align with commercial use cases.
If your project involves embedding Ultralytics software and AI models into commercial products or services and you wish to bypass the open-source requirements of AGPL-3.0, an Enterprise License is ideal.
Benefits of the Enterprise License include:
To ensure seamless integration and avoid AGPL-3.0 constraints, request an Ultralytics Enterprise License using the form provided. Our team will assist you in tailoring the license to your specific needs.
Begin your journey with the future of machine learning