Synopsis
Through the first fifty years of the computer revolution, scientists have been trying to program electronic circuits to process information the same way humans do. Doing so has reassured us all that underlying every new computer capability, no matter how miraculously fast or complex, are human thought processes and logic. But cutting-edge computer scientists are coming to see that electronic circuits really are alien, that the difference between the human mind and computer capability is not merely one of degree (how fast), but of kind (how). The author suggests that computers “think” best when their “thoughts” are allowed to emerge from the interplay of millions of tiny operations all interacting with each other in parallel. Why then, if computers bring such different strengths and weaknesses to the table, are we still trying to program them to think like humans? A work that ranges widely over the history of ideas from Galileo to Newton to Darwin, yet is just as comfortable in the cutting-edge world of parallel processing that is at this very moment yielding a new form of intelligence, After Thought describes why the real computer age is just beginning.
Reviews
The true electronic revolution has not yet happened, proclaims Bailey. A new breed of computers is emerging, using parallel processing and new mathematics ("intermaths") with exotic names like cellular automata, genetic algorithms and artificial life, which enable computers to continually change their own programs as they compute. Instead of the traditional mathematical vocabulary of numbers, symbols and equations, these computers emphasize emergent patterns, enabling scientists to investigate a world of perpetual novelty. The new computers are being used to analyze the behavior of bird flocks and consumers, to study the human immune system, to make financial decisions and to contour the molecular structure of effective drugs. Freelancer Bailey, a former executive at Thinking Machines Corp., predicts that the new computers will create their own versions of scientific theories and help us fathom biological and cultural evolution as well as the workings of the mind. This is a thoughtful, exciting preview of the dawning age of computing.
Copyright 1996 Reed Business Information, Inc.
Computer-aided math is now at a point where unaided human intelligence cannot follow the proofs, a fact that has profound implications for future science, according to Bailey (a former executive at Thinking Machines Corp.). He illustrates this thesis by summarizing the role of different forms of math in the history of science and philosophy. Ptolemy constructed his astronomical theory on a geometrical basis of perfect circles. But when astronomers (notably Tycho Brahe) began to collect data that failed to fit the theory, new mathematical tools became necessary to construct a more accurate model of the cosmos: first algebra, then calculus. Descartes's step-by-step sequential method was matched to the strengths of the human mind and gained its most impressive results from a miserly amount of data. But physical scientists came to scorn "mere" data collection. A true scientist worked to discover abstract theoretical principles; collecting data and doing arithmetic were the jobs of assistants. The earliest computers mimicked the methods of human calculators; their main advantages were increased speed and almost perfect accuracy. Advanced computers change all that, handling incredible floods of data with ridiculous ease--and in many cases, in parallel streams. It is no longer unthinkable to simply pile up huge quantities of fact and analyze the resulting patterns. The implications of this are most profound in disciplines to which the sequential maths were least adaptable: meteorology, biology, and economics, all of which generate enormous masses of seemingly chaotic data. The computers can analyze these data and discover patterns, even though the programmers can no longer follow their "reasoning." What this finally means is that we humans will increasingly have to accept computers as equal partners in the enterprise of science--and to accept as valid computer-generated results we cannot begin to understand. A fascinating tour of scientific history, concluding with a vision of a future that is at once exhilarating and profoundly unsettling. -- Copyright ©1996, Kirkus Associates, LP. All rights reserved.
With his famous declaration, "I think, therefore I am," Descartes placed the human thinker at the very center of the philosophical universe. But Bailey questions whether the human thinker can long maintain this position in a computerized world. With their massive memory banks and parallel centers for data processing, computers can analyze complex systems without forcing reality to fit within the formulas essential to human reasoning. And as computers develop the capacity to program themselves, the gap between human intelligence and electronic intelligence can only widen. Consequently, Bailey predicts that the mental abilities of a Copernicus, a Newton, or a Descartes will grow increasingly irrelevant in the coming decades. Whether mapping the galaxies, balancing national budgets, or assessing the risks of war, humans will--Bailey believes--become ever more dependent upon a computer-chip logic alien to human understanding. Because it says profoundly unsettling things in a clear and cogent way, Bailey's book will quickly establish itself as an important reference work for anyone concerned about our cultural trajectory. --Bryce Christensen
Bailey, a former executive at Thinking Machines, the manufacturer of one of the earliest lines of parallel processor computers, argues that computers using parallel processing, as opposed to traditional linear processing, will change the way we understand intelligence. Drawing on stories and examples from Galileo to contemporary thinkers, he seeks to explain why the parallel-processing approach will revolutionize information processing and analysis. Some of his examples and analogies are straightforward and understandable, but too often he makes unclear chronological and conceptual jumps. Part philosophy, part history of science, part computer science history, and part technological prediction, this book is difficult to follow and thus unconvincing. For academic libraries. --Hilary Burton, Lawrence Livermore National Lab., Livermore, Cal.
Copyright 1996 Reed Business Information, Inc.
"About this title" may belong to another edition of this title.