
Evolutionary Algorithms

Explore how evolutionary algorithms use natural selection to optimize AI. Learn to automate [hyperparameter tuning](https://www.ultralytics.com/glossary/hyperparameter-tuning) for [YOLO26](https://docs.ultralytics.com/models/yolo26/) to boost model performance.

Evolutionary Algorithms (EAs) are a powerful family of optimization algorithms that emulate the biological principles of natural selection and genetics to solve complex computational problems. Unlike traditional mathematical techniques that rely on calculus-based derivatives, such as stochastic gradient descent (SGD), EAs are designed to navigate vast, rugged, or poorly understood search spaces. They operate by maintaining a population of potential solutions that compete, reproduce, and mutate over time. This approach makes them particularly effective for tasks in artificial intelligence (AI) where the "best" solution is difficult to determine analytically, allowing systems to iteratively evolve towards an optimal outcome.

Biological Inspiration and Core Mechanisms

The functionality of an Evolutionary Algorithm is grounded in the concept of survival of the fittest. The process moves through a cycle of operators designed to mimic natural genetic evolution, gradually refining candidate solutions:

  1. Initialization: The system generates an initial population of random candidates. In the context of machine learning (ML), these candidates might represent different sets of model parameters.
  2. Fitness Evaluation: Each candidate is tested against a specific objective, known as a fitness function. For a computer vision (CV) model, this function usually evaluates metrics like accuracy or Mean Average Precision (mAP).
  3. Selection: Candidates with higher fitness scores are probabilistically selected to act as parents, ensuring that successful traits are preserved for the next generation.
  4. Reproduction and Variation: New solutions are created through crossover (recombining traits from two parents) and mutation (introducing random changes). This introduction of genetic diversity is critical, as it prevents the algorithm from stagnating in a local optimum, helping it explore the search space for the global maximum.
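The four-step cycle above can be sketched as a minimal, self-contained genetic algorithm. This is an illustrative toy, not code from any library: it maximizes a simple one-dimensional fitness function f(x) = -(x - 3)², whose optimum is at x = 3, and the population size, mutation scale, and selection scheme are arbitrary choices for the example.

```python
import random

# Toy evolutionary algorithm maximizing f(x) = -(x - 3)^2 (optimum at x = 3).
# All names and settings here are illustrative, not from any library.


def fitness(x: float) -> float:
    return -((x - 3.0) ** 2)


def evolve(pop_size=20, generations=50, mutation_scale=0.5, seed=0):
    rng = random.Random(seed)
    # 1. Initialization: random candidate solutions in [-10, 10]
    population = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # 2. Fitness evaluation and 3. selection: the fitter half become parents
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        # 4. Reproduction and variation: crossover (average two parents)
        # plus Gaussian mutation to maintain genetic diversity
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            children.append((a + b) / 2 + rng.gauss(0, mutation_scale))
        population = parents + children
    return max(population, key=fitness)


best = evolve()
print(round(best, 2))  # typically lands very close to the optimum x = 3
```

Because the surviving parents are carried over unchanged (elitism), the best candidate never gets worse, while mutation keeps exploring around it.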

Real-World Applications in AI

Evolutionary Algorithms are versatile and have been successfully applied to various domains within deep learning (DL) and engineering.

Automated Hyperparameter Tuning

One of the most practical applications of EAs is hyperparameter tuning. Modern neural networks require configuring dozens of parameters—such as learning rate, weight decay, and momentum—that significantly impact performance. EAs can automate this tedious trial-and-error process by evolving the configuration settings. For example, the tune() method in the Ultralytics library uses a genetic algorithm to discover the best training hyperparameters for YOLO26 models on custom datasets.

Neural Architecture Search (NAS)

EAs are a cornerstone of Neural Architecture Search (NAS). Instead of human engineers manually designing the structure of a neural network (NN), an evolutionary algorithm can "grow" the architecture. It tests different combinations of layers, neurons, and connections, evolving efficient structures that balance speed and accuracy. This technique has led to the creation of highly efficient backbones, such as EfficientNet, which are optimized for specific hardware constraints.
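To make the idea concrete, here is a deliberately simplified, mutation-only sketch of evolutionary NAS. An "architecture" is encoded as just a list of layer widths, and the fitness is a made-up proxy (reward total capacity, penalize parameter count); a real NAS system would train and validate every candidate network instead. All function names and constants below are hypothetical.

```python
import random

# Toy evolutionary NAS: an "architecture" is a list of layer widths.
# The fitness is a stand-in proxy, NOT a real accuracy measurement.


def proxy_fitness(widths):
    capacity = sum(widths)  # crude stand-in for model expressiveness
    params = sum(a * b for a, b in zip(widths, widths[1:]))  # rough weight count
    return capacity - 0.01 * params  # balance "accuracy" against model size


def mutate(widths, rng):
    # Widen or narrow one randomly chosen layer by a fixed step
    child = widths[:]
    i = rng.randrange(len(child))
    child[i] = max(8, child[i] + rng.choice([-16, 16]))
    return child


def evolve_arch(generations=40, pop_size=12, seed=0):
    rng = random.Random(seed)
    population = [
        [rng.choice([32, 64, 128]) for _ in range(3)] for _ in range(pop_size)
    ]
    for _ in range(generations):
        # Keep the fitter half, refill the population with mutated copies
        population.sort(key=proxy_fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(rng.choice(survivors), rng) for _ in survivors]
    return max(population, key=proxy_fitness)
```

Even this toy version shows the core NAS loop: propose structures, score them, and let the better-scoring structures seed the next generation.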

Evolutionary Algorithms vs. Swarm Intelligence

While both are nature-inspired optimization strategies, it is helpful to distinguish EAs from Swarm Intelligence (SI).

  • Evolutionary Algorithms: Rely on generational change. Individuals (solutions) live, reproduce based on fitness, and die, being replaced by their offspring. The primary drivers are genetic operators like mutation and crossover.
  • Swarm Intelligence: Mimics social interaction within a group, such as a flock of birds or a school of fish. Algorithms like Particle Swarm Optimization (PSO) involve a population of agents that move through the search space and adjust their positions based on their own experience and the success of their neighbors, without generational replacement.
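The contrast is easiest to see in code. The sketch below is a minimal Particle Swarm Optimization loop minimizing f(x) = (x - 3)²; the inertia and attraction coefficients (w, c1, c2) are common textbook defaults, not values from any specific library. Note that, unlike the EA above, no individual is ever replaced: each particle persists and keeps moving, steered by its own best position and the swarm's best.

```python
import random

# Minimal PSO sketch minimizing f(x) = (x - 3)^2. No generational replacement:
# every particle survives and updates its velocity each step.


def f(x: float) -> float:
    return (x - 3.0) ** 2


def pso(n_particles=15, steps=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    pos = [rng.uniform(-10, 10) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    personal_best = pos[:]         # best position each particle has visited
    global_best = min(pos, key=f)  # best position the whole swarm has visited
    for _ in range(steps):
        for i in range(n_particles):
            # Velocity = inertia + pull toward personal best + pull toward global best
            r1, r2 = rng.random(), rng.random()
            vel[i] = (
                w * vel[i]
                + c1 * r1 * (personal_best[i] - pos[i])
                + c2 * r2 * (global_best - pos[i])
            )
            pos[i] += vel[i]
            if f(pos[i]) < f(personal_best[i]):
                personal_best[i] = pos[i]
                if f(pos[i]) < f(global_best):
                    global_best = pos[i]
    return global_best
```

Here exploration comes from the randomized attraction terms rather than from mutation and crossover, which is the structural difference the bullet points describe.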

Implementing Optimization with Ultralytics

Practitioners can leverage genetic algorithms directly to optimize their object detection models. The Ultralytics tune method runs an evolutionary process to mutate hyperparameters over several generations, automatically identifying the settings that yield the highest performance on your validation data.

from ultralytics import YOLO

# Load the standard YOLO26 model
model = YOLO("yolo26n.pt")

# Run hyperparameter tuning using a genetic algorithm approach
# The tuner evolves parameters (lr, momentum, etc.) over 30 generations
model.tune(data="coco8.yaml", epochs=10, iterations=30, plots=False)

This automated refinement allows developers to move beyond manual guessing. For teams scaling their operations, managing these experiments and tracking the evolution of model performance can be streamlined using the Ultralytics Platform, which visualizes training metrics and facilitates model deployment.
