What makes people smarter than computers? These volumes by a pioneering neurocomputing group suggest that the answer lies in the massively parallel architecture of the human mind. They describe a new theory of cognition called connectionism that is challenging the idea of symbolic computation that has traditionally been at the center of debate in theoretical discussions about the mind.
The authors' theory assumes the mind is composed of a great number of elementary units connected in a neural network. Mental processes are interactions between these units, which excite and inhibit one another in parallel rather than in sequential operations. In this view, knowledge can no longer be thought of as stored in localized structures; instead, it consists of the connections between pairs of units distributed throughout the network.
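The idea sketched above can be illustrated in a few lines of code. This is a minimal sketch, not a model from the book: a small network whose "knowledge" lives entirely in a hypothetical weight matrix `W` (positive entries excitatory, negative inhibitory), with all units updated in parallel at each step.

```python
import numpy as np

# Hypothetical connection weights: knowledge is distributed across W,
# not stored in any single unit. Positive = excitatory, negative = inhibitory.
rng = np.random.default_rng(0)
n_units = 5
W = rng.normal(scale=0.5, size=(n_units, n_units))
np.fill_diagonal(W, 0.0)  # no self-connections in this toy example

def update(activations, steps=10):
    """Update every unit in parallel: each sums its weighted inputs."""
    a = activations.copy()
    for _ in range(steps):
        net_input = W @ a      # all units computed at once, not sequentially
        a = np.tanh(net_input) # squashing keeps activations bounded in (-1, 1)
    return a

a0 = rng.uniform(-1, 1, size=n_units)  # an initial activation pattern
print(update(a0))
```

The matrix-vector product makes the parallelism explicit: no unit is visited "first," and the resulting activation pattern depends only on the pattern of connection strengths.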
Volume 1 lays the foundations of this exciting theory of parallel distributed processing, while Volume 2 applies it to a number of specific issues in cognitive science and neuroscience, with chapters describing models of aspects of perception, memory, language, and thought.
This two-volume work is now considered a classic in the field. It presents the results of the Parallel Distributed Processing (PDP) group's work in the early 1980s and provides a good overview of earlier neural network research. The PDP approach (also known as connectionism, among other names) is based on the conviction that various aspects of cognitive activity are best thought of in terms of massively parallel processing. The first volume begins with the general framework and continues with an analysis of learning mechanisms and the various mathematical and computational tools important in the analysis of neural networks. The chapter on backpropagation is written by Rumelhart, Hinton, and Williams, who published the algorithm in 1986. The second volume is written with a psychological and biological emphasis, exploring the relationship of PDP to various aspects of human cognition. The book is a comprehensive research survey of its time, and most of its results and methods remain at the foundation of the neural network field.