Neuromorphic computing hardware that relies on conventional backpropagation-based training procedures is difficult to scale, because such training requires full observability of the network state and programmability of the network parameters. The search for hardware-friendly, biologically plausible learning schemes, and for platforms suited to them, is therefore pivotal for the future development of the field. We present an experimental study of a photonic integrated neural network featuring rich recurrent nonlinear dynamics together with both short- and long-term plasticity. Scalability in these architectures is greatly enhanced by the ability to process inputs and generate outputs encoded concurrently in the temporal, spatial and wavelength domains. Moreover, we discuss a novel biologically plausible, backpropagation-free and hardware-friendly learning procedure based on our neuromorphic hardware.