***A New Kind of Science*** **is a book by Stephen Wolfram, published in 2002. It has two primary purposes: 1) to promote the use of simple abstract systems to supplement and replace traditional mathematical equations in making models of natural systems, and 2) to introduce the empirical study of what simple abstract systems - essentially simple computer programs - actually do. There is also a less emphasized third track: the implications that discoveries along the first two lines have for general thought.**

Notable for its publicity and for the number of negative reviews in reputable journals, *A New Kind of Science* is a large, complex work that tries to introduce a large number of unfamiliar and intertwined ideas simultaneously, in as non-technical a format as possible.

To a certain degree, the book can be understood in three parts: Chapters 2-7 introduce a large number of qualitatively different systems and try to address some basic questions about such systems. Chapters 8 and 9 try to show that these systems are not isolated from the natural world, but are in fact essential for understanding it. Chapters 10, 11, and 12 address our understanding of the world, cutting across semiotics, linguistics, and philosophy, as well as more formalized structures such as mathematics and computer science. They build the theoretical underpinnings that explain in a fundamental way the discoveries and observations of the earlier chapters.

This new kind of science is essentially the study of simple computer programs. Because of undecidability, there is in general no way to predict what such a program will do short of actually running it. Wolfram terms this inability to shortcut the program, or otherwise describe its behavior in a simple way, "computational irreducibility". The empirical fact is that the world of simple programs contains a great diversity of behavior, and Wolfram advocates that it be explored both for its own sake and because of its connections to other fields.
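Among the simple programs the book studies, elementary cellular automata are the canonical example: a row of black-and-white cells updated by a rule that looks only at each cell and its two neighbors. The following is a minimal sketch of one such program (rule 30, noted in the book for its complex behavior); the function names are illustrative, not from the book.

```python
# Minimal sketch of an elementary cellular automaton.
# Each of the 8 possible neighborhoods (left, center, right) maps to a
# new cell value given by one bit of the rule number (here rule 30).

def step(cells, rule=30):
    """Apply one update step, with wraparound at the edges."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=31, steps=15, rule=30):
    """Evolve from a single black cell and collect all rows."""
    cells = [0] * width
    cells[width // 2] = 1
    rows = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        rows.append(cells)
    return rows

if __name__ == "__main__":
    for row in run():
        print("".join("#" if c else "." for c in row))
```

Despite the rule fitting in a single byte, the printed pattern shows no simple repetition - a small illustration of why Wolfram argues such programs must be run to be understood.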

Wolfram originally sought to develop a framework for understanding complexity in nature. His conclusion was that the complex behavior of a wide range of systems could be investigated using a single framework by equating the evolution of a system to a computation, and the complexity of a system to its computational sophistication.

Because simple programs can generate great complexity, they are ideal as an abstract basis for the study of complexity: they are in a sense minimal examples of the behavior of interest. To relate the complexity of abstract systems to that of the natural world, one must realize that, because of computational irreducibility, the details of a system's underlying setup bear little relation to its final behavior. A concrete analogy: air and water have very different underlying components, but at large scales exhibit much the same dynamics. Likewise, the same kinds of phenomena occur whether the components are bits or molecules.

Therefore, Wolfram argues, it is possible to study the abstract world of simple programs, take lessons from the kinds of things that occur there, and keep them in mind when investigating natural systems.

The first lesson is that because computational irreducibility is common in systems simple enough to be found in nature, we cannot rely on methods that seek to summarize behavior - such as traditional mathematics. One cannot simply write down an equation or extract a number and with it capture behavior of great complexity.

Additionally, it is far more difficult to go from the behavior of a system to its underlying rule than the other way around. With the technology and knowledge we now have, we should instead search all possible rules of a given type to see whether we can find what we are looking for, rather than trying to construct it.
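This rule-search strategy can be sketched concretely with elementary cellular automata: rather than designing a rule with a desired property, enumerate all 256 rules and test each one. The property tested here (a single black cell still producing black cells after some steps) is an illustrative choice of mine, not an example from the book.

```python
# Toy rule search: enumerate all 256 elementary CA rules and keep those
# with a given property, instead of constructing such a rule by hand.

def step(cells, rule):
    """One update of an elementary cellular automaton with wraparound."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def seed_survives(rule, width=21, steps=10):
    """Does a single black cell still leave black cells after `steps`?"""
    cells = [0] * width
    cells[width // 2] = 1
    for _ in range(steps):
        cells = step(cells, rule)
    return any(cells)

surviving = [r for r in range(256) if seed_survives(r)]
print(len(surviving), "of 256 rules keep a single seed alive")
```

The search is exhaustive because the rule space is finite and tiny; for richer model classes the same idea applies, only with larger (sampled) spaces.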

Finally, when developing a class of models to search, it is important to build in as little structure as possible, instead of building in the known behavior a priori. That behavior may emerge in unexpected ways from a simpler structure containing fewer assumptions, and such a model will be more likely to reproduce aspects of the behavior that are not yet known or designed for.

To confirm the power of his method, Wolfram presents a series of models of various physical and biological systems that are noted for their complex forms, and shows how the models lead to various predictions, sometimes in contradiction to existing theories. For instance, his leaf and shell models suggest that complex morphological forms are essentially sampled at random from the space of possibilities, instead of being finely crafted by the forces of natural selection.

Wolfram goes on to propose building a fundamental theory of physics on a radically new foundation that obliterates current notions of space, time, and matter. The proposal is that the universe is a big network of nodes, and evolution consists of graph rewriting. All that is essential about the universe is the interconnection pattern between the nodes; familiar things like space and matter emerge to those inside the network as a large-scale effect on top of local randomness.
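Evolution by graph rewriting can be illustrated with a toy rule; the subdivision rule below (replace each edge by two edges through a fresh node) is purely for illustration and is not one of the rules Wolfram actually proposes.

```python
# Toy sketch of evolution by graph rewriting: the "universe" is just a
# list of edges, and one update replaces every edge (a, b) with two
# edges (a, m) and (m, b) through a newly created node m.

def subdivide(edges, next_node):
    """Apply the rewrite rule to every edge; return new edges and the
    next unused node id."""
    new_edges = []
    for a, b in edges:
        m = next_node
        next_node += 1
        new_edges += [(a, m), (m, b)]
    return new_edges, next_node

# Start from a triangle and apply the rule twice.
edges, fresh = [(0, 1), (1, 2), (2, 0)], 3
for _ in range(2):
    edges, fresh = subdivide(edges, fresh)
print(len(edges), "edges after two rewrites")  # 3 -> 6 -> 12
```

Even in this toy, all that persists across updates is the interconnection pattern; there are no coordinates, and any geometry would have to emerge from the network itself.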

The final third of the book is a buildup to the Principle of Computational Equivalence. It states that "almost all processes that are not obviously simple can be viewed as computations of equivalent sophistication." It asserts that there is an ultimate level of computational sophistication that is reached far more easily and frequently than is typically assumed. Among the major implications is an explanation for why things often appear complex: there is a competition between the observer and the observed, and in many cases the two are equal in computational sophistication. In that case the observer cannot extract a brief description of the process, and it appears complex.

Wolfram steps through a surprising number of implications, many at the foundations of fields. At times he seems to admit that the statement of the principle is somewhat vague, and essentially says that to understand and believe it requires a new intuition built by the study of simple programs.

## Criticism

The book has attracted several types of criticism.

The first is plagiarism: it has been claimed that Wolfram did not really present any new ideas, but essentially just elaborated on Konrad Zuse's old book on computable universes and cellular automata (Calculating Space, 1969, translated by MIT in 1970), without giving proper credit to Zuse, and actually misrepresenting Zuse's contributions in a self-aggrandizing way. Similarly, Wolfram's section on Turing machine-computable physics was criticised as a rehash of a paper by Juergen Schmidhuber, published in 1997. In particular, his graph rewriting systems can be viewed as being subsumed by Schmidhuber's simple program that computes all computable universe histories. Wolfram also was accused of largely ignoring and improperly referencing the prior work of Edward Fredkin on reversible digital physics (Fredkin published several papers on this topic after Zuse visited his lab at MIT, and published a widely known and freely downloadable book draft in 2001). Wolfram was also criticised for downplaying the role of Matthew Cook, who apparently proved what was called the book's most interesting though not earth-shaking result (about a particular cellular automaton).

The second type of criticism comes from people who reject the book's basic premise, namely, that real-world patterns may be the result of the execution of very simple programs. These critics thereby implicitly also criticise the earlier work by Zuse, Fredkin, Schmidhuber, Petrov, and others.

The third type of criticism addresses the vagueness of some of the concepts in the book, such as Wolfram's allegedly fuzzy notion of randomness, which contrasts with well-known mathematical definitions of randomness such as Kolmogorov complexity. Similarly, Wolfram's Principle of Computational Equivalence was criticised for its alleged lack of novelty and mathematical rigor.

There have been other, less crucial criticisms as well. One category is that Wolfram bypassed entirely the usual scientific practice of critical review, i.e. he published his book before having it reviewed by peers in the field. Less important is that Wolfram invents new terms for old concepts rather than using terminology already extant in the field. Another category of criticism is that Wolfram comes close to making what many view as outrageous claims, such as that his work will make obsolete many existing scientific and mathematical methods. These are common characteristics of pseudoscience or crank science, though few if any critics have dared to call Wolfram a crank in prominent publications. Physicist Freeman Dyson has said of Wolfram's ANKOS, "There's a tradition of scientists approaching senility to come up with grand, improbable theories. Wolfram is unusual in that he's doing this in his 40s."

## External links

- Official site
- Ed Clark's review page: Collection of reviews of NKS
- Book review by Steven G. Krantz for the American Mathematical Society
- Amazon's book reviews, ranked by votes