Evolutionary Neural Architecture Search with Dual Contrastive Learning


📝 Original Info

  • Title: Evolutionary Neural Architecture Search with Dual Contrastive Learning
  • ArXiv ID: 2512.20112
  • Date: 2025-12-23
  • Authors: Xian-Rong Zhang, Yue-Jiao Gong, Wei-Neng Chen, Jun Zhang

📝 Abstract

Evolutionary Neural Architecture Search (ENAS) has gained attention for automatically designing neural network architectures. Recent studies use a neural predictor to guide the process, but gathering training data is computationally expensive, since each label requires fully training an architecture. Achieving a high-precision predictor within a limited compute budget (i.e., a capped number of fully trained architecture-label pairs) is therefore crucial for ENAS success. This paper introduces ENAS with Dual Contrastive Learning (DCL-ENAS), a novel method that employs two stages of contrastive learning to train the neural predictor. In the first stage, contrastive self-supervised learning is used to learn meaningful representations from neural architectures without requiring labels. In the second stage, fine-tuning with contrastive learning is performed to accurately predict the relative performance of different architectures rather than their absolute performance, which is sufficient to guide the evolutionary search. Across NASBench-101 and NASBench-201, DCL-ENAS achieves the highest validation accuracy, surpassing the strongest published baselines by 0.05% (ImageNet16-120) to 0.39% (NASBench-101). On a real-world ECG arrhythmia classif...
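The two stages described above can be sketched as two loss functions: a self-supervised contrastive loss over augmented views of architecture embeddings (stage 1), and a pairwise ranking loss that only asks the predictor to order architectures correctly (stage 2). This is a minimal illustrative sketch, not the paper's exact formulation; the NT-Xent-style loss, the hinge ranking loss, and all hyperparameters here are assumptions.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Stage 1 (sketch): contrastive self-supervised loss over two augmented
    'views' of a batch of architecture embeddings, NT-Xent style.
    z1, z2: (N, d) arrays; row i of z1 and row i of z2 are a positive pair."""
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)                   # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)       # cosine similarity
    sim = (z @ z.T) / temperature                          # (2N, 2N)
    np.fill_diagonal(sim, -np.inf)                         # exclude self-pairs
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_denom = np.log(np.exp(sim).sum(axis=1))            # log-sum over candidates
    log_prob = sim[np.arange(2 * n), pos] - log_denom
    return float(-log_prob.mean())

def pairwise_ranking_loss(scores, labels, margin=0.1):
    """Stage 2 (sketch): hinge loss on score differences. The predictor is only
    penalized when it mis-orders a pair, matching the idea of predicting
    relative rather than absolute performance."""
    losses = []
    for i in range(len(scores)):
        for j in range(len(scores)):
            if labels[i] > labels[j]:                      # i should outscore j
                losses.append(max(0.0, margin - (scores[i] - scores[j])))
    return float(np.mean(losses)) if losses else 0.0
```

A correctly ordered batch yields zero ranking loss regardless of the absolute score values, which is why relative-performance fine-tuning needs fewer fully trained architecture-label pairs than regression to exact accuracies.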

📄 Full Content

...(The full text is omitted here due to its length. Please see the original site for the complete content.)
