A 24.1 TOPS/W mixed-signal BNN processor in 28-nm CMOS

Bibliographic Details
Published in: International Journal of Electronics, Vol. 111, No. 8, pp. 1288-1300
Main Authors: Kim, Hanseul, Park, Jongmin, Lee, Hyunbae, Yang, Hyeokjoon, Burm, Jinwook
Format: Journal Article
Language: English
Published: Abingdon: Taylor & Francis, 02-08-2024
Description
Summary: A mixed-signal binarized neural network (BNN) processor for MNIST image classification is demonstrated, based on analogue circuit networks. The BNN algorithm, which trains neural networks with binary weights and activations, reduces power consumption and memory size. The design performs the core operations of a multi-layer perceptron (MLP) in analogue circuits to reduce complexity and power consumption. The mixed-signal BNN processor employs a current-mirror neuron to perform multiply-and-accumulate (MAC) operations and the sign activation function. A near-threshold current-mirror neuron computes the key operations of the BNN algorithm to achieve low power consumption. The design occupies 0.065 mm² in 28-nm CMOS with 560 B of on-chip SRAM. The 28-nm CMOS test chip achieves an energy efficiency of 24.1 TOPS/W and 94% accuracy on MNIST image classification. With binary weights and activations, the design exhibits only 5% accuracy degradation compared with floating-point-precision models.
ISSN: 0020-7217; 1362-3060
DOI: 10.1080/00207217.2023.2224070
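
To make the abstract's core operation concrete, the following is a minimal software sketch of a binarized MLP layer: a MAC over binary (±1) weights and activations followed by a sign activation, which is the computation the paper maps onto analogue current-mirror neurons. All names, shapes, and the random data here are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the BNN core operation: binary MAC + sign activation.
# This is a digital-software analogue of what the paper's current-mirror
# neurons compute in the analogue domain; details are assumed, not sourced.
import numpy as np

def binarize(x: np.ndarray) -> np.ndarray:
    """Map real values to {-1, +1}, the binary encoding a BNN uses."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def bnn_layer(activations: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """One binarized MLP layer: binary MAC followed by sign activation.

    activations: shape (n_in,), values in {-1, +1}
    weights:     shape (n_out, n_in), values in {-1, +1}
    """
    # MAC: with {-1, +1} operands each multiply is a sign-agreement test,
    # and the accumulate is an integer sum. Widen to int32 so the sum of
    # up to n_in terms cannot overflow the int8 storage type.
    pre_activation = weights.astype(np.int32) @ activations.astype(np.int32)
    # Sign activation re-binarizes the output for the next layer.
    return binarize(pre_activation)

# Hypothetical usage: a flattened 28x28 MNIST image and 128 hidden units.
rng = np.random.default_rng(0)
x = binarize(rng.standard_normal(784))         # binarized input pixels
W = binarize(rng.standard_normal((128, 784)))  # binary weight matrix
h = bnn_layer(x, W)                            # hidden activations in {-1, +1}
print(h[:10])
```

Because every operand is ±1, each multiply reduces to a sign comparison and the accumulation to a signed count, which is what makes the operation cheap enough to realize with near-threshold analogue current mirrors rather than full digital multipliers.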