Fault-tolerant training of neural networks in the presence of MOS transistor mismatches
Date
2001
Authors
Öğrenci, Arif Selçuk
Dündar, Günhan
Balkır, Sina
Publisher
IEEE - Institute of Electrical and Electronics Engineers Inc.
Abstract
Analog techniques are desirable for hardware implementation of neural networks due to their numerous advantages, such as small size, low power, and high speed. However, these advantages are often offset by the difficulty of training analog neural network circuitry. In particular, training of the circuitry by software based on hardware models is impaired by statistical variations in the integrated circuit production process, resulting in performance degradation. In this paper, a new paradigm of noise injection during training for the reduction of this degradation is presented. The variations at the outputs of analog neural network circuitry are modeled based on the transistor-level mismatches occurring between identically designed transistors. Those variations are used as additive noise during training to increase the fault tolerance of the trained neural network. The results of this paradigm are confirmed via numerical experiments and physical measurements, and are shown to be superior to the case of adding random noise during training.
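The noise-injection idea in the abstract can be illustrated with a minimal sketch, assuming a toy multilayer perceptron trained with backpropagation. The per-neuron noise standard deviations below are hypothetical stand-ins for the output variations that the paper derives from transistor-level mismatch; they are not the paper's actual circuit model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-4-1 network for XOR; weights, rates, and the mismatch model below
# are illustrative assumptions, not the paper's exact derivation.
W1 = rng.normal(0, 0.5, (4, 2)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (1, 4)); b2 = np.zeros(1)

# Assumed per-neuron output deviations: placeholders for the variations
# derived from mismatch between identically designed transistors.
sigma_hidden = 0.05 * (1 + rng.random(4))   # hypothetical values
sigma_out    = 0.05 * (1 + rng.random(1))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
Y = np.array([[0], [1], [1], [0]], float)

def tanh_deriv(a):
    return 1.0 - a ** 2

lr = 0.5
for epoch in range(5000):
    # Forward pass with mismatch-shaped additive noise at each neuron output
    h = np.tanh(X @ W1.T + b1)
    h_noisy = h + rng.normal(0, sigma_hidden, h.shape)
    o = np.tanh(h_noisy @ W2.T + b2)
    o_noisy = o + rng.normal(0, sigma_out, o.shape)

    # Backpropagation through the noisy activations (squared-error loss)
    err = o_noisy - Y
    delta_o = err * tanh_deriv(o)
    delta_h = (delta_o @ W2) * tanh_deriv(h)

    W2 -= lr * delta_o.T @ h_noisy / len(X); b2 -= lr * delta_o.mean(0)
    W1 -= lr * delta_h.T @ X / len(X);       b1 -= lr * delta_h.mean(0)

# Noise-free evaluation, mimicking deployment on a nominal (mismatch-free) chip
print(np.round(np.tanh(np.tanh(X @ W1.T + b1) @ W2.T + b2), 2))
```

In this sketch the noise is drawn independently at every epoch, so the trained weights must perform well over a range of perturbed circuit behaviors rather than a single nominal one, which is the intuition behind the fault-tolerance claim.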
Keywords
Backpropagation, Neural network hardware, Neural network training, Transistor mismatch
Citation Count
10
WoS Q
Q2
Scopus Q
Q1
Volume
48
Issue
3
Start Page
272
End Page
281