MIN Faculty
Department of Informatics
Knowledge Technology

Warning!

This webpage is outdated and has moved. Please find the official Knowledge Technology page at:

https://www.inf.uni-hamburg.de/en/inst/ab/wtm/

Books and Special Issues

Emergent Neural Computational Architectures based on Neuroscience

Stefan Wermter, Jim Austin, David Willshaw

March 2001, Springer, Heidelberg
ISBN: 3-540-42363-X, DM 118,00, 577 pp. Softcover

This book is the result of a series of International Workshops organised by the EmerNet project on Emergent Neural Computational Architectures based on Neuroscience, sponsored by the Engineering and Physical Sciences Research Council (EPSRC). The overall aim of the book is to present a broad spectrum of current research into biologically inspired computational systems and thereby encourage the emergence of new computational approaches based on neuroscience. It is generally understood that present computing approaches do not match the performance, flexibility and reliability of biological information processing systems. Although there is a massive body of knowledge about how processing occurs in the brain and central nervous system, this has so far had little impact on mainstream computing.

The process of developing biologically inspired computerised systems involves examining the functionality and architecture of the brain, with an emphasis on its information processing activities. Such systems address neural computation from the position of both neuroscience and computing, using experimental evidence to create general neuroscience-inspired systems.

The book focuses on the main research areas of modular organisation and robustness, timing and synchronisation, and learning and memory storage. The issues considered as part of these include: How can the modularity in the brain be used to produce large scale computational architectures? How does the human memory manage to continue to operate despite failure of its components? How does the brain synchronise its processing? How does the brain compute with relatively slow computing elements but still achieve rapid and real-time performance? How can we build computational models of these processes and architectures? How can we design incremental learning algorithms and dynamic memory architectures? How can the natural information processing systems be exploited for artificial computational methods?
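As a toy illustration of the learning and memory storage theme, the sketch below implements a Willshaw-style binary associative memory with clipped Hebbian learning, one of the classic neuroscience-inspired memory models. It is purely illustrative and not drawn from any chapter of the book; the pattern size, sparsity and recall threshold are arbitrary choices made for this example.

import numpy as np

# Illustrative sketch (not from the book): a Willshaw-style binary
# associative memory with clipped Hebbian learning. All sizes below
# (256 units, 8 active units per pattern, 20 stored patterns) are
# arbitrary choices for the example.

rng = np.random.default_rng(0)
n, k, n_patterns = 256, 8, 20

def random_sparse_pattern(n, k, rng):
    """Binary pattern with exactly k active units."""
    p = np.zeros(n, dtype=np.uint8)
    p[rng.choice(n, size=k, replace=False)] = 1
    return p

patterns = [random_sparse_pattern(n, k, rng) for _ in range(n_patterns)]

# Hebbian (clipped) storage: a synapse is switched on whenever its
# pre- and post-synaptic units are active together in a stored pattern.
W = np.zeros((n, n), dtype=np.uint8)
for p in patterns:
    W |= np.outer(p, p)

def recall(cue, threshold):
    """One-step recall: threshold the summed input driven by the cue."""
    return (W @ cue >= threshold).astype(np.uint8)

# Degrade a stored pattern (silence half of its active units), then
# recall; the intact synapses restore the missing units.
cue = patterns[0].copy()
active = np.flatnonzero(cue)
cue[active[: k // 2]] = 0
restored = recall(cue, threshold=int(cue.sum()))
print("recovered", int(restored @ patterns[0]), "of", k, "original units")

Setting the recall threshold to the number of active cue units keeps completion conservative; with sparse patterns the stored pattern is usually restored with few spurious activations.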


Chapters

Towards Novel Neuroscience-inspired Computing
Stefan Wermter, Jim Austin, David Willshaw and Mark Elshaw

Modular Organisation and Robustness

Images of the Mind: Brain Images and Neural Networks
John Taylor
Stimulus-Independent Data Analysis for fMRI
Silke Dodel, J. Michael Herrmann and Theo Geisel
Emergence of Modularity within One Sheet of Neurons: A Model Comparison
Cornelius Weber and Klaus Obermayer
Computational Investigation of Hemispheric Specialization and Interactions
James Reggia, Yuri Shkuro and Natalia Shevtsova
Explorations of the Interaction between Split Processing and Stimulus Types
John Hicks and Padraic Monaghan
Modularity and Specialized Learning: Mapping Between Agent Architectures and Brain Organization
Joanna Bryson and Lynne Andrea Stein
Biased Competition Mechanisms for Visual Attention in a Multimodular Neurodynamical System
Gustavo Deco
Recurrent Long-Range Interactions in Early Vision
Thorsten Hansen, Wolfgang Sepp and Heiko Neumann
Neural Mechanisms for Representing Surface and Contour Features
Thorsten Hansen and Heiko Neumann
Representations of Neuronal Models using Minimal and Bilinear Realisations
Gary Green, Will Woods and S. Manchanda
Collaborative Cell Assemblies: Building Blocks of Cortical Computation
Ronan Reilly
On the Influence of Threshold Variability in a Mean-field Model of the Visual Cortex
Hauke Bartsch, Martin Stetter and Klaus Obermayer
Towards Computational Neural Systems Through Developmental Evolution
Alistair Rust, Rod Adams, Stella George and Hamid Bolouri
The Complexity of the Brain: Structural, Functional and Dynamic Modules
Péter Érdi and Tamás Kiss

Timing and Synchronisation

Synchronisation, Binding and the Role of Correlated Firing in Fast Information Transmission
Simon Schultz, Huw Golledge and Stefano Panzeri
Segmenting State into Entities and its Implication for Learning
James Henderson
Temporal Structure of Neural Activity and Modelling of Information Processing in the Brain
Roman Borisyuk, Galina Borisyuk and Yakov Kazanovich
Role of the Cerebellum in Time-Critical Goal-Oriented Behaviour: Anatomical Basis and Control Principle
Guido Bugmann
Locust Olfaction: Synchronous Oscillations in Excitatory and Inhibitory Groups of Spiking Neurons
David Sterratt
Temporal Coding in Neuronal Populations in the Presence of Axonal and Dendritic Conduction Time Delays
David Halliday
The Role of Brain Chaos
Péter András
Neural Network Classification of Word Evoked Neuromagnetic Brain Activity
Ramin Assadollahi and Friedemann Pulvermüller
Simulation Studies of the Speed of Recurrent Processing
Stefano Panzeri, Edmund Rolls, Francesco Battaglia and Ruth Lavis

Learning and Memory Storage

The Dynamics of Learning and Memory: Lessons from Neuroscience
Michael Denham
Biological Grounding of Recruitment Learning and Vicinal Algorithms in Long-term Potentiation
Lokendra Shastri
Plasticity and Nativism: Towards a Resolution of an Apparent Paradox
Gary Marcus
Cell Assemblies as an Intermediate Level Model of Cognition
Christian Huyck
Modelling Higher Cognitive Functions with Hebbian Cell Assemblies
Marcin Chady
Spiking Associative Memory and Scene Segmentation by Synchronization of Cortical Activity
Andreas Knoblauch and Günther Palm
A Familiarity Discrimination Algorithm Inspired by Computations of the Perirhinal Cortex
Rafal Bogacz, Malcolm Brown and Christophe Giraud-Carrier
Linguistic Computation with State Space Trajectories
Hermann Moisl
Robust Stimulus Encoding in Olfactory Processing: Hyperacuity and Efficient Signal Transmission
Tim Pearce, Paul Verschure, Joel White and John Kauer
Finite-State Computation in Analog Neural Networks: Steps Towards Biologically Plausible Models?
Mikel Forcada and Rafael Carrasco
An Investigation into the Role of Cortical Synaptic Depression in Auditory Processing
Sue Denham and Michael Denham
The Role of Memory, Anxiety and Hebbian Learning in Hippocampal Function: Novel Explorations in Computational Neuroscience and Robotics
John Kazer and Amanda Sharkey
Using a Time-Delay Actor-Critic Neural Architecture with Dopamine-like Reinforcement Signal for Learning in Autonomous Robots
Andrés Pérez-Uribe
Connectionist Propositional Logic: A Simple Correlation Matrix Memory Based Reasoning System
Daniel Kustrin and Jim Austin
Analysis and Synthesis of Agents that Learn from Distributed Dynamic Data Sources
Doina Caragea, Adrian Silvescu and Vasant Honavar
Connectionist Neuroimaging
Stephen José Hanson, Michiro Negishi and Catherine Hanson

Emergent Neural Computational Architectures based on Neuroscience can be ordered from Springer-Verlag using the order form and accessed on-line with the appropriate login and password from Springer.



Contact

Prof. Stefan Wermter
University of Hamburg
Department of Informatics, Knowledge Technology
Vogt-Kölln-Str. 30
22527 Hamburg
Germany

Phone: +49 40 428 83 2434
Fax: +49 40 428 83 2515
Secretary: +49 40 428 83 2433
Email: wermter at informatik dot uni-hamburg dot de