
Neural Engineering


Computational Neuroscience Terrence J. Sejnowski and Tomaso A. Poggio, editors

Neural Nets in Electric Fish, Walter Heiligenberg, 1991

The Computational Brain, Patricia S. Churchland and Terrence J. Sejnowski, 1992

Dynamic Biological Networks: The Stomatogastric Nervous System, edited by Ronald M. Harris-Warrick, Eve Marder, Allen I. Selverston, and Maurice Moulins, 1992

The Neurobiology of Neural Networks, edited by Daniel Gardner, 1993

Large-Scale Neuronal Theories of the Brain, edited by Christof Koch and Joel L. Davis, 1994

The Theoretical Foundations of Dendritic Function: Selected Papers of Wilfrid Rall with Commentaries, edited by Idan Segev, John Rinzel, and Gordon M. Shepherd, 1995

Models of Information Processing in the Basal Ganglia, edited by James C. Houk, Joel L. Davis, and David G. Beiser, 1995

Spikes: Exploring the Neural Code, Fred Rieke, David Warland, Rob de Ruyter van Steveninck, and William Bialek, 1997

Neurons, Networks, and Motor Behavior, edited by Paul S. Stein, Sten Grillner, Allen I. Selverston, and Douglas G. Stuart, 1997

Methods in Neuronal Modeling: From Ions to Networks, second edition, edited by Christof Koch and Idan Segev, 1998

Fundamentals of Neural Network Modeling: Neuropsychology and Cognitive Neuroscience, edited by Randolph W. Parks, Daniel S. Levine, and Debra L. Long, 1998

Neural Codes and Distributed Representations: Foundations of Neural Computation, edited by Laurence Abbott and Terrence J. Sejnowski, 1999

Unsupervised Learning: Foundations of Neural Computation, edited by Geoffrey Hinton and Terrence J. Sejnowski, 1999

Fast Oscillations in Cortical Circuits, Roger D. Traub, John G. R. Jefferys, and Miles A. Whittington, 1999

Computational Vision: Information Processing in Perception and Visual Behavior, Hanspeter A. Mallot, 2000

Graphical Models: Foundations of Neural Computation, edited by Michael I. Jordan and Terrence J. Sejnowski, 2001

Self-Organizing Map Formation: Foundations of Neural Computation, edited by Klaus Obermayer and Terrence J. Sejnowski, 2001

Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems, Chris Eliasmith and Charles H. Anderson, 2003


Neural Engineering

Computation, Representation, and Dynamics in Neurobiological Systems

Chris Eliasmith and Charles H. Anderson

A Bradford Book The MIT Press Cambridge, Massachusetts London, England


© 2003 Massachusetts Institute of Technology

All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher.

This book was typeset in Times by the authors using LyX and LaTeX and was printed and bound in the United States of America.

Library of Congress Cataloging-in-Publication Data

Eliasmith, Chris.
  Neural engineering : computation, representation, and dynamics in neurobiological systems / Chris Eliasmith and Charles H. Anderson.
    p. cm. – (Computational neuroscience)
  "A Bradford book."
  Includes bibliographical references and index.
  ISBN 0-262-05071-4 (hc.)
  1. Neural networks (Neurobiology) 2. Neural networks (Computer science) 3. Computational neuroscience. I. Anderson, Charles H. II. Title. III. Series.

QP363.3 .E454 2002
573.8’5–dc21  2002070166

10 9 8 7 6 5 4 3 2 1


To Jen, Alana, Alex, and Charlie

and

To David Van Essen


Contents

Preface  xiii
Using this book as a course text  xvii
Acknowledgments  xix

1  Of neurons and engineers  1
1.1  Explaining neural systems  3
1.2  Neural representation  5
   1.2.1  The single neuron  9
   1.2.2  Beyond the single neuron  11
1.3  Neural transformation  13
1.4  Three principles of neural engineering  15
   1.4.1  Principle 1  16
   1.4.2  Principle 2  17
   1.4.3  Principle 3  18
   1.4.4  Addendum  18
1.5  Methodology  19
   1.5.1  System description  19
   1.5.2  Design specification  21
   1.5.3  Implementation  21
   1.5.4  Discussion  22
1.6  A possible theory of neurobiological systems  23

I  REPRESENTATION

2  Representation in populations of neurons  29
2.1  Representing scalar magnitudes  30
   2.1.1  Engineered representation  30
   2.1.2  Biological representation  33
2.2  Noise and precision  40
   2.2.1  Noisy neurons  40
   2.2.2  Biological representation and noise  42
2.3  An example: Horizontal eye position  44
   2.3.1  System description  44
   2.3.2  Design specification  46
   2.3.3  Implementation  47
   2.3.4  Discussion  48
2.4  Representing vectors  49
2.5  An example: Arm movements  52
   2.5.1  System description  53
   2.5.2  Design specification  54
   2.5.3  Implementation  55
   2.5.4  Discussion  55
2.6  An example: Semicircular canals  57
   2.6.1  System description  57
   2.6.2  Implementation  58
2.7  Summary  59

3  Extending population representation  61
3.1  A representational hierarchy  61
3.2  Function representation  63
3.3  Function spaces and vector spaces  69
3.4  An example: Working memory  72
   3.4.1  System description  73
   3.4.2  Design specification  74
   3.4.3  Implementation  77
   3.4.4  Discussion  78
3.5  Summary  79

4  Temporal representation in spiking neurons  81
4.1  The leaky integrate-and-fire (LIF) neuron  81
   4.1.1  Introduction  81
   4.1.2  Characterizing the LIF neuron  83
   4.1.3  Strengths and weaknesses of the LIF neuron model  88
4.2  Temporal codes in neurons  89
4.3  Decoding neural spikes  92
   4.3.1  Introduction  92
   4.3.2  Neuron pairs  94
   4.3.3  Representing time dependent signals with spikes  96
   4.3.4  Discussion  103
4.4  Information transmission in LIF neurons  105
   4.4.1  Finding optimal decoders in LIF neurons  105
   4.4.2  Information transmission  109
   4.4.3  Discussion  114
4.5  More complex single neuron models  115
   4.5.1  Adapting LIF neuron  116
   4.5.2  θ-neuron  118
   4.5.3  Adapting, conductance-based neuron  123
   4.5.4  Discussion  126
4.6  Summary  127

5  Population-temporal representation  129
5.1  Putting time and populations together again  129
5.2  Noise and precision: Dealing with distortions  132
5.3  An example: Eye position revisited  136
   5.3.1  Implementation  136
   5.3.2  Discussion  137
5.4  Summary  139

II  TRANSFORMATION

6  Feed-forward transformations  143
6.1  Linear transformations of scalars  143
   6.1.1  A communication channel  143
   6.1.2  Adding two variables  148
6.2  Linear transformations of vectors  151
6.3  Nonlinear transformations  153
   6.3.1  Multiplying two variables  154
6.4  Negative weights and neural inhibition  160
   6.4.1  Analysis  161
   6.4.2  Discussion  166
6.5  An example: The vestibular system  168
   6.5.1  System description  169
   6.5.2  Design specification  174
   6.5.3  Implementation  175
   6.5.4  Discussion  180
6.6  Summary  182

7  Analyzing representation and transformation  185
7.1  Basis vectors and basis functions  185
7.2  Decomposing  192
7.3  Determining possible transformations  196
   7.3.1  Linear tuning curves  200
   7.3.2  Gaussian tuning curves  204
7.4  Quantifying representation  206
   7.4.1  Representational capacity  206
   7.4.2  Useful representation  208
7.5  The importance of diversity  210
7.6  Summary  216

8  Dynamic transformations  219
8.1  Control theory and neural models  221
   8.1.1  Introduction to control theory  221
   8.1.2  A control theoretic description of neural populations  222
   8.1.3  Revisiting levels of analysis  225
   8.1.4  Three principles of neural engineering quantified  230
8.2  An example: Controlling eye position  232
   8.2.1  Implementation  233
   8.2.2  Discussion  240
8.3  An example: Working memory  244
   8.3.1  Introduction  244
   8.3.2  Implementation  244
      8.3.2.1  Dynamics of the vector representation  244
      8.3.2.2  Simulation results  245
   8.3.3  Discussion  248
8.4  Attractor networks  250
   8.4.1  Introduction  250
   8.4.2  Generalizing representation  254
   8.4.3  Generalizing dynamics  256
   8.4.4  Discussion  258
8.5  An example: Lamprey locomotion  260
   8.5.1  Introduction  260
   8.5.2  System description  261
   8.5.3  Design specification  264
   8.5.4  Implementation  265
   8.5.5  Discussion  271
8.6  Summary  273

9  Statistical inference and learning  275
9.1  Statistical inference and neurobiological systems  275
9.2  An example: Interpreting ambiguous input  281
9.3  An example: Parameter estimation  283
9.4  An example: Kalman filtering  287
   9.4.1  Two versions of the Kalman filter  288
   9.4.2  Discussion  291
9.5  Learning  293
   9.5.1  Learning a communication channel