A digital CMOS fully connected neural network with in-circuit learning capability and automatic identification of spurious attractors
Abstract
Summary form only given. An electronic implementation of a fully connected feedback network containing 64 neurons is considered. The technology is fully digital CMOS, with binary neurons and 9-bit signed synaptic coefficients. The architecture trades connectivity against speed by implementing a linear systolic loop in which each neuron locally stores its own synaptic coefficients. The authors first implemented in-circuit learning using the Widrow-Hoff rule, which converges iteratively towards the projection rule, thereby tolerating partial correlation between prototypes and yielding a higher capacity than the Hebb rule. They also implemented an internal mechanism for detecting relaxations onto spurious states. The combination of these two properties gives the network a high degree of autonomy, so that an external computer is needed only for writing and reading data and asserting simple control signals.
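The abstract does not give implementation details, so the following is only a minimal software sketch of the ingredients it names: iterative Widrow-Hoff (delta-rule) learning that converges towards the projection rule, relaxation of binary neurons, and flagging of attractors that match no stored prototype. The function names (widrow_hoff_train, relax, is_spurious), the learning rate, and the 4-prototype example are illustrative assumptions, not taken from the paper or its hardware.

```python
import numpy as np

def widrow_hoff_train(prototypes, eta=0.1, epochs=200):
    """Iteratively learn synaptic weights with the Widrow-Hoff (delta) rule.

    Repeated presentation of each prototype drives W towards the projection
    (pseudo-inverse) rule, which tolerates partially correlated prototypes
    better than plain Hebbian learning.
    prototypes: array of shape (P, N) with entries in {-1, +1}.
    """
    P, N = prototypes.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        for xi in prototypes:
            error = xi - W @ xi              # residual between target and local field
            W += (eta / N) * np.outer(error, xi)
    return W

def relax(W, state, max_steps=50):
    """Synchronously update binary neurons until a fixed point is reached."""
    s = state.copy()
    for _ in range(max_steps):
        new = np.sign(W @ s)
        new[new == 0] = 1                    # break ties towards +1
        if np.array_equal(new, s):
            break
        s = new
    return s

def is_spurious(state, prototypes):
    """Flag attractors that match no stored prototype (up to a global sign flip)."""
    return not any(np.array_equal(state, p) or np.array_equal(state, -p)
                   for p in prototypes)

# Toy usage: 64 binary neurons, a few partially correlated prototypes.
rng = np.random.default_rng(0)
protos = np.sign(rng.standard_normal((4, 64)))
W = widrow_hoff_train(protos)
noisy = protos[0].copy()
noisy[:6] *= -1                              # corrupt a few bits of a prototype
attractor = relax(W, noisy)
print("spurious attractor:", is_spurious(attractor, protos))
```

In the chip described by the abstract, the learning iterations, the relaxation, and the spurious-state check are all performed on-chip; the sketch above only mirrors that flow in software.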
Keywords
Widrow-Hoff rule
automatic identification
connectivity
convergence
digital CMOS circuit
feedback
fully connected neural network
in-circuit learning capability
iteration
linear systolic loop
projection rule
relaxations
signed synaptic coefficients
speed
spurious attractors
CMOS integrated circuits
digital integrated circuits
learning systems
neural nets
relaxation