
Neural Architecture Search (NAS)

Learn how Neural Architecture Search (NAS) automates the design of high-performance neural networks. Explore search strategies and optimized models like YOLO26.

Neural Architecture Search (NAS) is a sophisticated technique within the realm of Automated Machine Learning (AutoML) that automates the design of artificial neural networks. Traditionally, engineering high-performance deep learning (DL) architectures required extensive human expertise, intuition, and time-consuming trial-and-error. NAS replaces this manual process with algorithmic strategies that systematically explore a vast range of network topologies to discover the optimal structure for a specific task. By testing various combinations of layers and operations, NAS can identify architectures that significantly outperform human-designed models in terms of accuracy, computational efficiency, or inference speed.

Core Mechanisms of NAS

The process of discovering a superior architecture generally involves three fundamental dimensions that interact to find the best neural network (NN); a toy sketch of how they fit together follows this list:

  1. Search Space: This defines the set of all possible architectures the algorithm can explore. It acts like a library of building blocks, such as convolution filters, pooling layers, and various activation functions. A well-defined search space constrains complexity to ensure the search remains computationally feasible while allowing enough flexibility for innovation.
  2. Search Strategy: Instead of testing every possibility (brute force), NAS employs intelligent algorithms to navigate the search space efficiently. Common approaches include reinforcement learning, where an agent learns to generate better architectures over time, and evolutionary algorithms, which mutate and combine top-performing models to breed superior candidates.
  3. Performance Estimation Strategy: Training every candidate network from scratch is prohibitively slow. To accelerate this, NAS uses estimation techniques—such as training on fewer epochs, using lower-resolution proxy datasets, or employing weight sharing—to quickly rank the potential of a candidate architecture.
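
To make these three ideas concrete, here is a minimal, self-contained sketch of a toy NAS loop using plain random search. The search space, the proxy_score stand-in, and all the numbers are illustrative inventions rather than part of any real NAS library; a production system would replace proxy_score with a short training-and-validation run.

import random

# 1. Search space: each candidate is described by a few structural choices.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6, 8],
    "channels": [16, 32, 64],
    "kernel_size": [3, 5],
    "activation": ["relu", "swish"],
}


def sample_architecture():
    """Search strategy (plain random search here): sample one candidate."""
    return {key: random.choice(options) for key, options in SEARCH_SPACE.items()}


def proxy_score(arch):
    """Performance estimation stand-in.

    A real NAS system would briefly train the candidate (few epochs, a small
    proxy dataset, or shared weights) and return validation accuracy; this
    toy function simply rewards moderate depth and wider blocks.
    """
    depth_term = -abs(arch["num_layers"] - 6)  # prefer roughly 6 layers
    width_term = arch["channels"] / 64  # prefer wider blocks
    return depth_term + width_term + random.uniform(0.0, 0.1)  # noisy estimate


best_arch, best_score = None, float("-inf")
for _ in range(50):  # evaluate 50 randomly sampled candidates
    candidate = sample_architecture()
    score = proxy_score(candidate)
    if score > best_score:
        best_arch, best_score = candidate, score

print(f"Best architecture found: {best_arch} (score={best_score:.3f})")

In practice, the interesting differences between NAS methods lie in replacing the random sampler with a smarter search strategy and in making the performance estimate cheaper and more faithful.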

Real-World Applications

NAS has become critical in industries where hardware constraints or performance requirements are strict, pushing the boundaries of computer vision (CV) and other AI domains.

  • Efficient Edge Computing: Deploying AI on mobile devices requires models that are both lightweight and fast. NAS is extensively used to discover architectures like MobileNetV3 and EfficientNet that minimize inference latency while maintaining high precision. This is vital for edge AI applications, such as real-time video analytics on smart cameras or autonomous drones.
  • Medical Imaging: In medical image analysis, accuracy is paramount. NAS can tailor networks to detect subtle anomalies in X-rays or MRI scans, often finding novel feature extraction pathways that human engineers might overlook. This leads to more reliable tools for identifying conditions like brain tumors or fractures with higher sensitivity.

NAS vs. Related Concepts

To understand the specific role of NAS, it is helpful to distinguish it from similar optimization techniques:

  • NAS vs. Hyperparameter Tuning: While both involve optimization, hyperparameter tuning focuses on adjusting configuration settings (such as the learning rate or batch size) for a fixed structure. In contrast, NAS changes the fundamental structure of the model itself, such as the number of layers or how neurons are connected, as the sketch after this list illustrates.
  • NAS vs. Transfer Learning: Transfer learning takes an existing, pretrained model and adapts its weights to a new task. NAS creates the model architecture from scratch (or searches for a better backbone) before training begins.
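
The following minimal sketch contrasts the two loops. The mock_validation_score placeholder and the specific value grids are hypothetical and would be replaced by a real training-and-evaluation pipeline.

def mock_validation_score(architecture, learning_rate, batch_size):
    """Hypothetical placeholder for a full train-and-validate run."""
    return 0.0  # a real pipeline would return a validation metric here


# Hyperparameter tuning: the architecture stays fixed; only settings vary.
fixed_architecture = {"num_layers": 6, "channels": 64}
for lr in (0.1, 0.01, 0.001):
    for batch in (16, 32):
        mock_validation_score(fixed_architecture, learning_rate=lr, batch_size=batch)

# NAS: the structure of the model itself is what varies.
for layers in (2, 4, 6):
    for channels in (16, 32, 64):
        candidate = {"num_layers": layers, "channels": channels}
        mock_validation_score(candidate, learning_rate=0.01, batch_size=32)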

Using Models Derived from NAS

While running a full NAS search requires significant GPU resources, developers can readily use models that were created via NAS. For example, the YOLO-NAS architecture was discovered using these search principles to optimize object detection tasks.

The following Python example shows how to load and use a pre-searched NAS model with the ultralytics package:

from ultralytics import NAS

# Load a pre-trained YOLO-NAS model (architecture found via NAS)
# 'yolo_nas_s.pt' refers to the small version of the model
model = NAS("yolo_nas_s.pt")

# Run inference on an image to detect objects
# This utilizes the optimized architecture for fast detection
results = model("https://ultralytics.com/images/bus.jpg")

# Print the class name of the first detected object
print(f"Detected: {results[0].names[int(results[0].boxes.cls[0])]}")

For those looking to train state-of-the-art models without the complexity of NAS, Ultralytics YOLO26 offers a highly optimized architecture out of the box, incorporating the latest advancements in research. You can easily manage datasets, training, and deployment for these models using the Ultralytics Platform, which simplifies the entire MLOps lifecycle.
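
As a rough sketch of that workflow with the ultralytics Python package, the snippet below assumes a checkpoint named 'yolo26n.pt' and uses the small coco8.yaml demo dataset; adjust the weights file and dataset YAML to match your own project.

from ultralytics import YOLO

# Load a pretrained checkpoint (the 'yolo26n.pt' name is an assumption;
# use whichever model variant you actually have available)
model = YOLO("yolo26n.pt")

# Fine-tune on a dataset described by a YAML file; coco8.yaml is a tiny
# demo dataset bundled with the ultralytics package
model.train(data="coco8.yaml", epochs=10, imgsz=640)

# Evaluate on the validation split, then run inference on a sample image
metrics = model.val()
results = model("https://ultralytics.com/images/bus.jpg")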
