Analogue Imprecision in MLP Training (Progress in Neural Processing, Vol. 4)
Peter Edwards, Alan F. Murray (University of Edinburgh, UK)
Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines the implications for learning and network performance. The book shows how incorporating an imprecision model into the learning scheme, as a fault-tolerance hint, aids understanding of the accuracy and precision requirements of a particular implementation. The study also shows how such a scheme can yield significant performance enhancement.
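The idea described above, training with a noise model of the hardware so the learned weights tolerate analogue imprecision, can be sketched as follows. This is a minimal illustration, not the book's actual procedure: it assumes multiplicative Gaussian weight noise as the fault model, a small sigmoid MLP trained by backpropagation on the XOR task, and a hypothetical noise level `noise_std`; a fresh noise realisation is sampled at every training step so the stored weights are pushed toward noise-robust solutions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR toy problem; the third input column is a constant bias input.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
t = np.array([[0.0], [1.0], [1.0], [0.0]])

# Stored ("ideal") weights: 3 inputs -> 4 hidden -> 1 output.
W1 = rng.standard_normal((3, 4))
W2 = rng.standard_normal((4, 1))

noise_std = 0.1  # assumed relative weight-noise level (a free parameter here)
lr = 1.0

for epoch in range(10000):
    # Fault model: multiplicative Gaussian noise on every synaptic weight,
    # resampled each step to mimic analogue imprecision during training.
    n1 = W1 * (1.0 + noise_std * rng.standard_normal(W1.shape))
    n2 = W2 * (1.0 + noise_std * rng.standard_normal(W2.shape))

    # Forward pass through the *noisy* weights.
    h = sigmoid(X @ n1)
    y = sigmoid(h @ n2)

    # Backprop (squared-error loss) through the noisy weights;
    # the updates are applied to the stored noiseless weights.
    err = y - t
    d2 = err * y * (1.0 - y)
    d1 = (d2 @ n2.T) * h * (1.0 - h)
    W2 -= lr * (h.T @ d2)
    W1 -= lr * (X.T @ d1)

# Noiseless evaluation of the trained network.
h = sigmoid(X @ W1)
y = sigmoid(h @ W2)
print(np.round(y, 2))
```

Because the gradient is averaged over many noise realisations, the trained weights tend to sit in flat regions of the error surface, which is one intuition for why such networks degrade gracefully when deployed on imprecise analogue hardware.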