Abstract

Recent successes in deep learning for vision and natural language processing are largely attributed to ever-larger models, but scaling these models up raises energy-consumption and scalability concerns. Training of digital deep-learning models currently relies primarily on backpropagation, which is ill-suited to physical implementation because it requires an accurate differentiable model of the system being trained. In this work, we propose a simple deep neural network architecture augmented by a physical local learning (PhyLL) algorithm, which enables supervised and unsupervised training of deep physical neural networks without detailed knowledge of the nonlinear physical layer’s properties. We trained diverse wave-based physical neural networks in vowel and image classification experiments, showcasing the universality of our approach. Compared with other hardware-aware training schemes, our method improves training speed, enhances robustness, and reduces power consumption: by eliminating the need for system modeling, it substantially decreases the digital computation required.

Details
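The abstract's central claim, that a deep physical network can be trained without modeling its nonlinear layers, can be illustrated with a toy simulation. The sketch below is not the paper's PhyLL algorithm: it assumes a Forward-Forward-style contrastive "goodness" objective and a simultaneous-perturbation (SPSA) gradient estimate as stand-ins, chosen only to show how layer-local updates can be computed from forward passes through a black box. All names here (`physical_layer`, `local_loss`, `spsa_step`), the loss form, and the hyperparameters are illustrative assumptions.

```python
# Toy sketch (NOT the paper's exact PhyLL algorithm): layer-local,
# backpropagation-free training of stacked black-box "physical" layers.
import numpy as np

rng = np.random.default_rng(0)
dim = 8
v = rng.normal(size=dim)

def sample_batch(n=32):
    """Positive samples share a hidden structure; negatives are pure noise."""
    s = rng.normal(size=(n, 1))
    x_pos = s * v + 0.1 * rng.normal(size=(n, dim))
    x_neg = rng.normal(size=(n, dim))
    return x_pos, x_neg

def physical_layer(x):
    """Stand-in for an unknown nonlinear physical system: queried only
    through forward passes, never differentiated or modeled."""
    return np.tanh(x + 0.3 * np.sin(2.0 * x))

def local_loss(W, x_pos, x_neg):
    """Layer-local contrastive objective (Forward-Forward-style 'goodness'):
    favor strong physical responses to positives, weak ones to negatives."""
    y_pos = physical_layer(x_pos @ W)
    y_neg = physical_layer(x_neg @ W)
    return np.mean(y_neg ** 2) - np.mean(y_pos ** 2)

def spsa_step(W, x_pos, x_neg, lr=0.05, eps=1e-3):
    """One update from two forward queries of the black box (simultaneous
    perturbation), so no backpropagation through the physics is needed."""
    delta = rng.choice([-1.0, 1.0], size=W.shape)
    g = (local_loss(W + eps * delta, x_pos, x_neg)
         - local_loss(W - eps * delta, x_pos, x_neg)) / (2.0 * eps)
    return W - lr * g * delta

# Two stacked layers, each trained with its own purely local objective:
# layer 2 never sends error signals back through layer 1.
W1 = rng.normal(size=(dim, dim)) / np.sqrt(dim)
W2 = rng.normal(size=(dim, dim)) / np.sqrt(dim)
for step in range(200):
    x_pos, x_neg = sample_batch()
    W1 = spsa_step(W1, x_pos, x_neg)
    h_pos, h_neg = physical_layer(x_pos @ W1), physical_layer(x_neg @ W1)
    W2 = spsa_step(W2, h_pos, h_neg)
```

In the actual experiments, the forward queries would be physical wave propagations rather than a simulated `tanh`, and the paper's update rule differs from the SPSA estimate used here; the sketch shows only that each layer's digital weights can be trained from forward measurements alone, with no error signal propagated back through the physical system.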