Discover epochs in machine learning - their impact on model training, preventing overfitting, and optimizing performance with Ultralytics YOLO.
An epoch represents one complete cycle through the entire training dataset by a machine learning algorithm. During this process, the model has the opportunity to update its internal parameters based on every sample in the data exactly once. In the context of deep learning, a single pass is rarely sufficient for a neural network to learn complex patterns effectively. Therefore, training typically involves multiple epochs, allowing the learning algorithm to iteratively refine its understanding and minimize the error between its predictions and the actual ground truth.
The primary goal of training is to adjust model weights to minimize a specific loss function. Optimization algorithms, such as stochastic gradient descent (SGD) or the Adam optimizer, use the error calculated during each epoch to guide these adjustments. As the number of epochs increases, the model generally shifts from a state of high error (random guessing) to lower error (learned patterns).
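The iterative refinement described above can be sketched as a minimal gradient-descent loop. The toy dataset, learning rate, and epoch count below are illustrative assumptions, not part of any real training pipeline:

```python
import numpy as np

# Toy data (assumed for illustration): learn y = 2x with a single weight.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 2.0 * X

w = 0.0      # model weight, initialized far from the optimum
lr = 0.5     # learning rate
epochs = 20  # number of complete passes over the dataset

for epoch in range(epochs):
    # One epoch: every sample contributes to the gradient exactly once.
    pred = w * X
    grad = np.mean(2 * (pred - y) * X)  # gradient of the MSE loss w.r.t. w
    w -= lr * grad

print(w)  # approaches 2.0 as the epochs accumulate
```

Each pass through the loop is one epoch; the error shrinks gradually, which is why a single pass is rarely enough.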
However, selecting the correct number of epochs is a critical aspect of hyperparameter tuning: too few epochs leave the model underfit, while too many can cause it to memorize the training data and overfit.
It is common for beginners to confuse "epoch" with related terms. Understanding the hierarchy of these concepts is essential for configuring training loops correctly: a batch is the subset of samples processed before a single weight update, an iteration is one such update, and an epoch is completed once every batch in the dataset has been processed.
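The relationship between these terms reduces to simple arithmetic. The dataset size, batch size, and epoch count below are hypothetical values chosen only to make the hierarchy concrete:

```python
# Hypothetical numbers to illustrate the epoch / batch / iteration hierarchy.
dataset_size = 2000  # total training samples
batch_size = 32      # samples processed per weight update
epochs = 10          # complete passes over the dataset

# An iteration is one weight update on a single batch.
iterations_per_epoch = -(-dataset_size // batch_size)  # ceiling division
total_iterations = iterations_per_epoch * epochs

print(iterations_per_epoch)  # 63 iterations complete one epoch
print(total_iterations)      # 630 weight updates over the full run
```

Note the ceiling division: a final partial batch still counts as one iteration.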
The number of epochs required varies drastically depending on the complexity of the task and the size of the data.
When using modern frameworks like Ultralytics YOLO, defining the number of epochs is a straightforward argument in the training command. Tools like the Ultralytics Platform can help visualize the loss curves over each epoch to identify the optimal stopping point.
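Finding the optimal stopping point from a loss curve is commonly automated as early stopping. The validation-loss values and patience setting below are assumptions for illustration, not output from a real training run:

```python
# Minimal early-stopping sketch over per-epoch validation losses
# (the loss values here are invented for demonstration).
val_losses = [0.90, 0.60, 0.45, 0.40, 0.41, 0.39, 0.42, 0.44, 0.46, 0.48]

patience = 3  # stop after this many epochs without improvement
best_loss = float("inf")
best_epoch = 0

for epoch, loss in enumerate(val_losses):
    if loss < best_loss:
        best_loss, best_epoch = loss, epoch  # new best checkpoint
    elif epoch - best_epoch >= patience:
        break  # validation loss has stopped improving

print(best_epoch, best_loss)  # the epoch you would restore weights from
```

Training halts once the validation loss fails to improve for `patience` consecutive epochs, which guards against running too many epochs and overfitting.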
The following example demonstrates how to set the epoch count when training a YOLO26 model:
```python
from ultralytics import YOLO

# Load the YOLO26n model (nano version for speed)
model = YOLO("yolo26n.pt")

# Train the model for 50 epochs
# The 'epochs' argument determines how many times the model sees the entire dataset
results = model.train(data="coco8.yaml", epochs=50, imgsz=640)
```
In this snippet, the epochs=50 argument instructs the training engine to cycle through the coco8.yaml dataset 50 distinct times. During each cycle, the model performs forward propagation and backpropagation to refine its detection capabilities.