Nanoelectronic implementation of Equilibrium Propagation
28 September 2023
Abstract
As deep learning continues to grow, developing energy-efficient hardware adapted to it becomes crucial. Learning on a chip requires hardware-compatible learning algorithms and their realization with physically imperfect devices. Equilibrium Propagation, a training technique introduced in 2017 by Benjamin Scellier and Yoshua Bengio, provides gradient estimates through a spatially local learning rule, making it both more biologically plausible and more hardware-compatible than backpropagation. This work uses Equilibrium Propagation to train a neural network in hardware-in-the-loop simulations with hafnium oxide memristor synapses. Realizing this type of learning with imperfect and noisy devices paves the way for on-chip learning at very low energy.
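The presentation itself does not include code; as a rough illustration of the two-phase, spatially local update that Equilibrium Propagation relies on, the following NumPy sketch (with illustrative layer sizes, learning rate, and nudging strength beta, none of which come from this work) contrasts a free phase and a weakly nudged phase and updates each weight only from the activities of the two neurons it connects.

import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0.0, 0.1, (n_in, n_hid))   # input -> hidden weights
W2 = rng.normal(0.0, 0.1, (n_hid, n_out))  # hidden -> output weights

def rho(s):
    # Hard-sigmoid activation, as in the original Equilibrium Propagation paper.
    return np.clip(s, 0.0, 1.0)

def relax(x, y, beta, steps=50, dt=0.1):
    # Let hidden and output states settle toward a fixed point of the dynamics.
    # beta = 0 is the free phase; beta > 0 weakly nudges the output toward y.
    h = np.zeros(n_hid)
    o = np.zeros(n_out)
    for _ in range(steps):
        dh = -h + rho(x) @ W1 + rho(o) @ W2.T
        do = -o + rho(h) @ W2 + beta * (y - o)
        h = h + dt * dh
        o = o + dt * do
    return h, o

def eqprop_step(x, y, beta=0.5, lr=0.05):
    global W1, W2
    h0, o0 = relax(x, y, beta=0.0)    # free phase
    hb, ob = relax(x, y, beta=beta)   # nudged phase
    # Spatially local update: each synapse compares the product of its two
    # neurons' activities between the two phases, scaled by 1/beta.
    W1 += lr / beta * (np.outer(rho(x), rho(hb)) - np.outer(rho(x), rho(h0)))
    W2 += lr / beta * (np.outer(rho(hb), rho(ob)) - np.outer(rho(h0), rho(o0)))

# Toy usage: one update on a random input and a one-hot target.
x = rng.random(n_in)
y = np.array([1.0, 0.0])
eqprop_step(x, y)

In a hardware-in-the-loop setting such as the one described in the abstract, the weight arrays would correspond to memristor conductances read from and programmed on the chip, while the contrastive update rule would remain local to each synapse.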
Conference Presentation
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Marie Drouhin, Clément Turck, Kamel-Eddine Harabi, Adrien Renaudineau, Thomas Bersani-Veroni, Elisa Vianello, Jean-Michel Portal, Julie Grollier, and Damien Querlioz "Nanoelectronic implementation of Equilibrium Propagation", Proc. SPIE PC12655, Emerging Topics in Artificial Intelligence (ETAI) 2023, PC1265507 (28 September 2023); https://doi.org/10.1117/12.2677568
KEYWORDS: Nanoelectronics, Education and training, Neural networks, Analog electronics, Computer programming, Resistance, Spatial learning