Hyperparameter Tuning

Master hyperparameter tuning to optimize ML models such as Ultralytics YOLO. Boost accuracy, speed, and performance with expert techniques.

Hyperparameter tuning is the iterative process of optimizing the external configuration variables that govern the training process of a machine learning (ML) model. Unlike internal parameters—such as weights and biases which are learned from data during training—hyperparameters are set by the data scientist or engineer before the learning process begins. These settings control the model's structure and the algorithm's behavior, acting as the "knobs and dials" that fine-tune performance. Finding the ideal combination of these values is critical for maximizing metrics like accuracy and efficiency, often making the difference between a mediocre model and a state-of-the-art solution.
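As a concrete illustration, here is a minimal sketch using the ultralytics API: lr0 (initial learning rate), momentum, and batch are hyperparameters chosen before the run begins, while the network's weights and biases are updated automatically during it. The dataset and epoch count are placeholders.

from ultralytics import YOLO

# Hyperparameters such as lr0, momentum, and batch are fixed before training
model = YOLO("yolo26n.pt")

# Internal parameters (weights and biases) are learned during this call
model.train(data="coco8.yaml", epochs=10, lr0=0.01, momentum=0.937, batch=16)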

Core Concepts and Techniques

The collection of all possible hyperparameter combinations creates a high-dimensional search space. Practitioners use various strategies to navigate this space to find the optimal configuration that minimizes the loss function.

  • Grid Search: This exhaustive method evaluates the model for every specified combination of parameters in a grid. While thorough, it is computationally expensive and suffers from the curse of dimensionality when dealing with many variables.
  • Random Search: Instead of testing every combination, this technique selects random combinations of hyperparameters. Research suggests this is often more efficient than grid search, as it explores the search space more effectively for the most impactful parameters; a bare-bones version is sketched just after this list.
  • Bayesian Optimization: This probabilistic approach builds a surrogate model to predict which hyperparameters will yield the best results based on past evaluations, focusing the search on the most promising areas.
  • Evolutionary Algorithms: Inspired by biological evolution, this method uses mechanisms like mutation and crossover to evolve a population of configurations over generations. This is the primary method used by the ultralytics library to optimize modern architectures like YOLO26.
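To make the trade-off between grid and random search concrete, the following is a bare-bones random search over a toy two-dimensional space. The objective function is a synthetic stand-in for a real validation metric, and the bounds are illustrative only.

import random

def objective(lr, momentum):
    # Synthetic stand-in for a validation loss; lower is better
    return (lr - 0.01) ** 2 + (momentum - 0.9) ** 2

search_space = {"lr": (1e-4, 1e-1), "momentum": (0.8, 0.99)}

# Random search: sample independent configurations and keep the best one
best_config, best_score = None, float("inf")
for _ in range(50):
    config = {name: random.uniform(low, high) for name, (low, high) in search_space.items()}
    score = objective(**config)
    if score < best_score:
        best_config, best_score = config, score

print(f"Best configuration: {best_config} (score {best_score:.6f})")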

Hyperparameter Tuning vs. Model Training

It is essential to distinguish between tuning and training, as they represent distinct phases in the MLOps lifecycle. Training is the inner loop, in which an optimizer adjusts the model's internal weights and biases on the data for one fixed configuration. Tuning is the outer loop, which launches many such training runs with different hyperparameter settings and compares their validation metrics to select the best configuration.
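A schematic way to see this relationship: training is a single call that fits weights for one configuration, while tuning loops over many such calls. Here train_and_validate is a hypothetical stand-in for a full training run.

import random

def train_and_validate(lr, batch_size):
    # Hypothetical stand-in: a real implementation would train a model with
    # these hyperparameters (the inner loop) and return a validation metric
    return random.random()

# The tuning phase is the outer loop: one full training run per configuration
results = {}
for lr in (0.001, 0.01, 0.1):
    for batch_size in (16, 32):
        results[(lr, batch_size)] = train_and_validate(lr, batch_size)

best_config = max(results, key=results.get)
print("Best configuration:", best_config)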

Real-World Applications

Effectively tuned models are critical for deploying robust solutions in complex environments.

Precision Agriculture

In AI in Agriculture, autonomous drones use computer vision to identify weeds and crop diseases. These models often run on edge devices with limited battery life. Engineers utilize hyperparameter tuning to optimize the data augmentation pipeline and input resolution, ensuring the model balances high inference speeds with the precision needed to spray only the weeds, reducing chemical usage.
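A hedged sketch of how such constraints might translate into configuration choices with the ultralytics API; weeds.yaml is a hypothetical dataset, and every value here is illustrative rather than a recommendation.

from ultralytics import YOLO

model = YOLO("yolo26n.pt")

# Illustrative values only: a reduced input resolution (imgsz) favors fast
# inference on battery-limited edge hardware, while augmentation settings
# such as mosaic and rotation are tuned to reflect real field conditions
model.train(data="weeds.yaml", epochs=50, imgsz=416, mosaic=0.5, degrees=10.0, hsv_v=0.5)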

Medical Diagnostics

For AI in Healthcare, specifically in medical image analysis, a false negative can have severe consequences. When training models to detect anomalies in MRI scans, practitioners aggressively tune hyperparameters related to class weighting and focal loss. This tuning maximizes recall, ensuring that even subtle signs of pathology are flagged for human review, significantly aiding early diagnosis.
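The snippet below sketches the underlying idea in plain PyTorch rather than any specific library's loss: gamma and pos_weight are the kinds of hyperparameters a practitioner would sweep, with a larger pos_weight penalizing false negatives more heavily and pushing the model toward higher recall. All values are illustrative.

import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0, pos_weight=4.0):
    # gamma and pos_weight are tunable hyperparameters; raising pos_weight
    # makes missed positives (false negatives) cost more, boosting recall
    weight = targets * pos_weight + (1 - targets)
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p_t = torch.exp(-bce)  # probability the model assigned to the true class
    return (weight * (1 - p_t) ** gamma * bce).mean()

logits = torch.randn(8)
targets = torch.randint(0, 2, (8,)).float()
print(focal_loss(logits, targets, gamma=2.0, pos_weight=4.0))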

Automated Tuning with Ultralytics

The ultralytics library simplifies optimization by including a built-in tuner that utilizes genetic algorithms. This allows users to automatically search for the best hyperparameters for their custom datasets without manual trial-and-error. For large-scale operations, teams can leverage the Ultralytics Platform to manage datasets and visualize these tuning experiments in the cloud.

The following example demonstrates how to initiate hyperparameter tuning for a YOLO26 model. The tuner will mutate hyperparameters over several iterations to maximize Mean Average Precision (mAP).

from ultralytics import YOLO

# Initialize a YOLO26 model (using the 'nano' weight for speed)
model = YOLO("yolo26n.pt")

# Start tuning hyperparameters on the COCO8 dataset
# The tuner runs for 30 epochs per iteration, evolving parameters like lr0 and momentum
model.tune(data="coco8.yaml", epochs=30, iterations=100, optimizer="AdamW", plots=False)
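When the run completes, the tuner saves its results, including the best hyperparameter set it found, under the run's output directory, so the winning configuration can be reused in subsequent training runs; exact file names and locations may vary by version.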

By automating this process, developers can move closer to the concept of Automated Machine Learning (AutoML), where the system self-optimizes to achieve the best possible performance for a specific task.
