On the construction of artificial brains

Bibliographic Details
Corporate Authors: SpringerLink (Online service)
Group Author: Ramacher, Ulrich, 1949-; Malsburg, Christoph von der, 1942-
Published: Springer
Publisher Address: Berlin
Publication Dates: c2010.
Literature type: Book
Language: English
Online Access: http://dx.doi.org/10.1007/978-3-642-00189-5
Carrier Form: 1 online resource (viii, 359 p.): ill.
ISBN: 9783642001895 (electronic bk.)
3642001890 (electronic bk.)
Index Number: TP183
CLC: TP183
Notes: Includes bibliographical references and index.
Contents:
Cover -- Contents -- Prologue -- 0.1 Main Results -- 0.2 Prehistory of Our Project -- 0.3 Acknowledgement
1 The Difficulty of Modelling Artificial Brains -- 1.1 McCulloch-Pitts Model -- 1.2 Learning Nets -- 1.3 Spiking Neurons -- 1.4 Architecture of Vision -- 1.5 The Steps of the Construction Process -- 1.6 Summary
2 Information Processing in Nets with Constant Synapses -- 2.1 Generic Signal Equations for Pulse Neurons and Synapses -- 2.2 Partitions and Their Time Development -- 2.3 Experiments with Constant Synapses -- 2.4 Entropy and Transfer Function of a Net -- 2.5 Operating Range of a Net -- 2.6 Pulse Rates -- 2.7 Resolution and Net Size -- 2.8 Application Potential -- 2.9 Limited Simulation Time -- 2.10 Summary
3 Theory of Nets with Constant or Dynamic Synapses -- 3.1 Derivation of the Signal Energy -- 3.2 Temporal Mean and Spatial Mean -- 3.3 Determination of the Frequency Distribution -- 3.4 Summary
4 Macro-Dynamics of Nets with Constant Synapses -- 4.1 Known Synapses -- 4.2 Known Distribution of Synapses -- 4.3 Agreement of Theory with Experiment -- 4.4 Lack of Correlation -- 4.5 Determining the Signal Energy and Entropy by Pulse Rates -- 4.6 Summary
5 Information Processing with Dynamic Synapses -- 5.1 The Types of Solutions of Synaptic Equations -- 5.2 Synchronisation of Neurons -- 5.3 Segmentation per Synchronisation -- 5.4 Calculation of Pulse Differences and Sums -- 5.5 Simple Applications -- 5.6 Time Coding and Correlation -- 5.7 Entropy and State Space -- 5.8 Preliminary Considerations on the Statistics of Synchronisation -- 5.9 Summary
6 Nets for Feature Detection -- 6.1 Overview of Visual System -- 6.2 Simple Cells -- 6.3 Creation of Detector Profiles for Gabor Wavelets -- 6.4 Experimental Check -- 6.5 Summary
7 Nets for Feature Recognition -- 7.1 Principles of Object Recognition -- 7.2 Net Architecture for Robust Feature Recognition -- 7.3 Feature Recogniser -- 7.4 Selectivity -- 7.5 Orthogonality of Rotation -- 7.6 Invariance of Function as to Brightness -- 7.7 Invariance of Function as to Form and Mimic -- 7.8 Generating Object Components through Binding of Features -- 7.9 Summary
8 Nets for Robust Head Detection -- 8.1 Results of Head Detection -- 8.2 Next Steps -- 8.3 Summary
9 Extensions of the Vision Architecture -- 9.1 Distance-Invariant Feature Pyramid -- 9.2 The Inner Screen -- 9.3 Summary
10 Look-out -- 10.1 Data Format of the Brain -- 10.2 Self-Organisation -- 10.3 Learning -- 10.4 Invariant Object Recognition -- 10.5 Structured Memory Domains -- 10.6 Summary
11 Preliminary Considerations on the Microelectronic Implementation -- 11.1 Equivalent Representations -- 11.2 Microelectronic Implementations -- 11.3 Models of Neurons and Synapses
12 Elementary Circuits for Neurons, Synapses, and Photosensors -- 12.1 Neuron -- 12.2 Adaptive Synapses -- 12.3 Photosensors -- 12.4 DA-Converters and Analogue Image Storage -- 12.5 Summary
13 Simulation of Microelectronic Neural Circuits and Systems -- 13.1 Modelling of Neurons and Synapses -- 13.2 Results of Modelling -- 13.3 Notes on the Simulation Procedure -- 13.4 Summary
14 Architecture and Chip Design of the Feature Recognizer -- 14.1 Chip Architecture of the Feature Recognizer