Session
06-19, 14:30 (15 min)
Demonstration of High-Speed Operation of a ReLU Output Circuit for Superconducting Neural Networks
Yuto Ueno

Artificial Neural Networks (ANNs) are computational models inspired by the human brain and play a crucial role in processing large-scale data in artificial intelligence. To enhance energy efficiency and improve operational speed, ANN hardware implementations have been actively developed. This study focuses on the design of a single-flux quantum (SFQ) neuron circuit incorporating a Rectified Linear Unit (ReLU) activation function for superconducting ANN hardware.
The ReLU activation function is particularly well suited to ANNs because it mitigates the vanishing gradient problem in large-scale networks. The proposed circuit consists primarily of an SFQ resettable delay flip-flop (RDFF). When the data input frequency exceeds the reset input frequency, the RDFF emits SFQ pulses at a rate equal to the difference between the two frequencies; when the data input frequency is lower, no output is generated. The output frequency as a function of the data input frequency therefore follows the ReLU function.
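As a behavioral illustration of the transfer function described above (not a circuit-level simulation of the fabricated RDFF), the minimal sketch below models the rate-coded ReLU: the output pulse frequency is the positive part of the difference between the data and reset input frequencies. The function name and the example frequencies are hypothetical and chosen only for illustration.

# Behavioral model of the RDFF-based neuron's rate-coded ReLU:
# the output SFQ pulse frequency is max(0, f_data - f_reset).
# This is an illustrative sketch, not the authors' circuit model.

def rdff_output_frequency(f_data_ghz: float, f_reset_ghz: float) -> float:
    """Output pulse frequency (GHz) of the RDFF neuron: ReLU(f_data - f_reset)."""
    return max(0.0, f_data_ghz - f_reset_ghz)

if __name__ == "__main__":
    f_reset = 10.0  # hypothetical reset (threshold/bias) input frequency, GHz
    for f_data in (5.0, 10.0, 25.0, 40.0):
        f_out = rdff_output_frequency(f_data, f_reset)
        print(f"f_data = {f_data:5.1f} GHz -> f_out = {f_out:5.1f} GHz")

Here the reset input frequency acts as the neuron's threshold: inputs below it produce no output pulses, and inputs above it produce pulses at a rate that grows linearly with the input, which is exactly the ReLU shape.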
The circuit was fabricated using the 10 kA/cm² Nb High-Speed Standard Process at AIST and evaluated at 4.2 K. Experimental results confirmed correct operation of the ReLU activation function for data input frequencies up to 40 GHz.

Neuromorphic
Room "Berlin & Oslo"