
Epoch

Learn about epochs in machine learning - their impact on model training, preventing overfitting, and optimizing performance with Ultralytics YOLO.

An epoch represents one complete cycle through the entire training dataset by a machine learning algorithm. During this process, the model has the opportunity to update its internal parameters based on every sample in the data exactly once. In the context of deep learning, a single pass is rarely sufficient for a neural network to learn complex patterns effectively. Therefore, training typically involves multiple epochs, allowing the learning algorithm to iteratively refine its understanding and minimize the error between its predictions and the actual ground truth.

The Role of Epochs in Optimization

The primary goal of training is to adjust model weights to minimize a specific loss function. Optimization algorithms, such as stochastic gradient descent (SGD) or the Adam optimizer, use the error calculated during each epoch to guide these adjustments. As the number of epochs increases, the model generally shifts from a state of high error (random guessing) to lower error (learned patterns).
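The idea of error shrinking over successive epochs can be sketched with a toy gradient-descent loop. This is a minimal illustration, not any real training API: the quadratic loss, learning rate, and epoch count are all invented for the example.

```python
# Minimal sketch: gradient descent reducing a toy loss over many epochs.
# The quadratic loss and learning rate are illustrative placeholders.


def loss(w):
    return (w - 3.0) ** 2  # minimum at w = 3.0


def grad(w):
    return 2.0 * (w - 3.0)  # derivative of the loss above


w = 0.0  # start from a poor initial weight (high error)
lr = 0.1  # learning rate
history = []
for epoch in range(50):  # each loop is one "epoch" over our (trivial) data
    w -= lr * grad(w)  # SGD-style weight update
    history.append(loss(w))

print(f"initial loss: {loss(0.0):.4f}, final loss: {history[-1]:.6f}")
```

After 50 updates the weight has converged close to the optimum, mirroring how repeated epochs move a real model from random guessing toward learned patterns.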

However, selecting the correct number of epochs is a critical aspect of hyperparameter tuning.

  • Too few epochs: This can lead to underfitting, where the model has not yet captured the underlying trend of the data.
  • Too many epochs: This often results in overfitting, where the model memorizes the training noise rather than generalizing to new data. To prevent this, developers often monitor performance on validation data and employ techniques like early stopping to halt training when generalization stops improving.
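The early-stopping logic described above can be sketched in a few lines. The validation-loss values and the `patience` threshold here are synthetic stand-ins, not output from any real training run:

```python
# Hedged sketch of early stopping: halt when validation loss has not
# improved for `patience` consecutive epochs. Loss values are synthetic.

val_losses = [0.9, 0.7, 0.55, 0.50, 0.48, 0.49, 0.50, 0.51, 0.52, 0.53]

patience = 3
best_loss = float("inf")
epochs_without_improvement = 0
stopped_at = None

for epoch, loss in enumerate(val_losses):
    if loss < best_loss:
        best_loss = loss  # generalization improved: reset the counter
        epochs_without_improvement = 0
    else:
        epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            stopped_at = epoch  # no improvement for `patience` epochs
            break

print(f"best validation loss {best_loss}, stopped at epoch {stopped_at}")
```

The run stops at epoch 7, three epochs after the best validation loss (0.48) was reached, so the later, overfitting epochs are never trained.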

Epoch, Batch, and Iteration

It is common for beginners to confuse "epoch" with related terms. Understanding the hierarchy of these concepts is essential for configuring training loops correctly:

  • Epoch: One complete pass through the full dataset.
  • Batch: A subset of the dataset processed simultaneously. Because datasets are often too large to fit into GPU memory all at once, they are divided into smaller groups defined by the batch size.
  • Iteration: A single update to the model's weights. If a dataset has 1,000 images and the batch size is 100, it will take 10 iterations to complete one epoch.
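The arithmetic relating these three terms is straightforward. The numbers below reuse the example from the list (1,000 images, batch size 100); the epoch count of 50 is an illustrative choice:

```python
import math

# Worked arithmetic for the epoch/batch/iteration hierarchy.
dataset_size = 1_000  # images in the dataset
batch_size = 100  # images processed per weight update
epochs = 50  # full passes through the dataset

# ceil() handles datasets that do not divide evenly into batches
iterations_per_epoch = math.ceil(dataset_size / batch_size)
total_iterations = iterations_per_epoch * epochs

print(iterations_per_epoch)  # 10 weight updates per epoch
print(total_iterations)  # 500 updates over the whole run
```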

Real-World Applications

The number of epochs required varies drastically depending on the complexity of the task and the size of the data.

  • Medical Image Analysis: In medical image analysis, such as detecting tumors in MRI scans, accuracy is paramount. Models trained for these tasks often run for hundreds of epochs. This extensive training ensures the convolutional neural network (CNN) can discern subtle anomalies that distinguish malignant tissue from healthy tissue, potentially saving lives.
  • Autonomous Driving: For autonomous vehicles, object detection models must reliably identify pedestrians, signs, and other vehicles. Training these robust systems typically involves massive datasets like COCO or Objects365. While the dataset size is huge, the model still requires multiple epochs to converge on a solution that generalizes well to diverse weather and lighting conditions.

Managing Training Cycles with Code

When using modern frameworks like Ultralytics YOLO, defining the number of epochs is a straightforward argument in the training command. Tools like the Ultralytics Platform can help visualize the loss curves over each epoch to identify the optimal stopping point.

The following example demonstrates how to set the epoch count when training a YOLO26 model:

from ultralytics import YOLO

# Load the YOLO26n model (nano version for speed)
model = YOLO("yolo26n.pt")

# Train the model for 50 epochs
# The 'epochs' argument determines how many times the model sees the entire dataset
results = model.train(data="coco8.yaml", epochs=50, imgsz=640)

In this snippet, the epochs=50 argument instructs the training engine to cycle through the coco8.yaml dataset 50 distinct times. During each cycle, the model performs forward propagation and backpropagation to refine its detection capabilities.
