Emergent Neural Computational Architectures based on Neuroscience (2001)

Stefan Wermter, Jim Austin, David Willshaw
March 2001, Springer, Heidelberg
ISBN: 3-540-42363-X, DM 118,00, 577 pp., softcover
This book is the result of a series of International Workshops organised by the EmerNet project on Emergent Neural Computational Architectures based on Neuroscience, sponsored by the Engineering and Physical Sciences Research Council (EPSRC). The overall aim of the book is to present a broad spectrum of current research into biologically inspired computational systems and thereby encourage the emergence of new computational approaches based on neuroscience. It is generally understood that present approaches to computing do not have the performance, flexibility and reliability of biological information-processing systems. Although there is a massive body of knowledge about how processing occurs in the brain and central nervous system, this has so far had little impact on mainstream computing.
Developing biologically inspired computing systems involves examining the functionality and architecture of the brain, with an emphasis on its information-processing activities. Such systems address neural computation from the standpoint of both neuroscience and computing, using experimental evidence to create general neuroscience-inspired systems.
The book focuses on the main research areas of modular organisation and robustness, timing and synchronisation, and learning and memory storage. Among the issues considered are: How can the modularity of the brain be used to produce large-scale computational architectures? How does human memory continue to operate despite the failure of its components? How does the brain synchronise its processing? How does the brain achieve rapid, real-time performance with relatively slow computing elements? How can we build computational models of these processes and architectures? How can we design incremental learning algorithms and dynamic memory architectures? How can natural information-processing systems be exploited for artificial computational methods?
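To give a concrete flavour of the robustness question, here is a minimal sketch, not taken from the book, of a Willshaw-style binary associative memory; the k-winners-take-all readout and all parameter values are illustrative assumptions. It shows how a memory distributed across many simple connections can keep recalling stored patterns after a fraction of those connections fails.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_pattern(n, k):
    """Binary vector of length n with exactly k active units."""
    v = np.zeros(n, dtype=np.uint8)
    v[rng.choice(n, size=k, replace=False)] = 1
    return v

n, k, pairs = 256, 8, 20  # illustrative sizes, not from the book

# Store hetero-associative pairs (x, y) in a binary weight matrix:
# a connection is switched on whenever pre- and post-units are co-active.
X = np.array([sparse_pattern(n, k) for _ in range(pairs)])
Y = np.array([sparse_pattern(n, k) for _ in range(pairs)])
W = np.zeros((n, n), dtype=np.uint8)
for x, y in zip(X, Y):
    W |= np.outer(y, x)

def recall(weights, x):
    """k-winners-take-all readout: the k units with the largest sums fire."""
    s = weights @ x
    y_hat = np.zeros_like(x)
    y_hat[np.argsort(s)[-k:]] = 1
    return y_hat

def accuracy(weights):
    """Fraction of stored pairs recalled exactly."""
    hits = sum(np.array_equal(recall(weights, x), y) for x, y in zip(X, Y))
    return hits / pairs

# Simulate component failure by silencing a random 20% of the connections.
W_damaged = (W * (rng.random(W.shape) > 0.2)).astype(np.uint8)
print("intact  recall accuracy:", accuracy(W))
print("damaged recall accuracy:", accuracy(W_damaged))
```

Because each stored association is spread redundantly over many connections, recall accuracy typically stays high even after the simulated damage, which is the kind of graceful degradation the questions above allude to.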