
From: Stephen Wolfram, A New Kind of Science
Notes for Chapter 12: The Principle of Computational Equivalence
Section: Implications for Technology
Page 1193

Searching for technology. Many inventions are made by pure ingenuity (sometimes aided by mathematical calculation) or by mimicking processes that go on in nature. But there are also cases where systematic searches are done. Notable examples were the testing of thousands of materials as candidate electric light bulb filaments by Thomas Edison in 1879, and the testing of 606 substances for chemotherapy by Paul Ehrlich in 1910. For at least fifty years it has now been quite routine to test many hundreds or thousands of substances in looking, say, for catalysts or drugs with particular functions. (Other kinds of systematic searches done include ones for metal alloys, cooking recipes and plant hybrids.) Starting in the late 1980s the methods of combinatorial chemistry (see note below) began to make it possible to do biochemical tests on arrays of millions of related substances. And by the late 1990s, similar ideas were being used for example in materials science: in a typical case an array of different combinations of substances is made by successively spraying through an appropriate sequence of masks, with some physical or chemical test then applied to all the samples.

In the late 1950s maximal length shift register sequences (page 1089) and some error-correcting codes (page 1106) were found by systematic searches of possible polynomials. Most subsequent codes, however, have been found by explicit mathematical constructions. Optimal circuit blocks for operations such as addition and sorting (see page 1148) have occasionally been found by searches, but are more often found by explicit construction, progressive improvement or systematic logic minimization (see page 1102). In some compilers searches are occasionally done for optimal sequences of instructions to implement particular simple functions. And in recent years - notably in the building of Mathematica - optimal algorithms for operations such as function evaluation and numerical integration have sometimes been found through searches. In addition, my 1984 identification of rule 30 as a randomness generator was the result of a small-scale systematic search.
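The shift register searches mentioned above can be illustrated with a minimal sketch (not the historical procedure itself): try every feedback-tap pattern of an n-bit linear feedback shift register, and keep those whose state sequence visits all 2^n - 1 nonzero states before repeating. The function names and the Fibonacci-style register convention here are illustrative assumptions.

```python
def lfsr_period(taps, n):
    """Steps for a Fibonacci LFSR with the given tap mask to return to state 1.

    Returns a value larger than 2^n - 1 if the register falls into a cycle
    that never revisits the starting state (e.g. the all-zero fixed point).
    """
    full = (1 << n) - 1
    state = 1
    for step in range(1, full + 1):
        feedback = bin(state & taps).count("1") & 1   # parity of the tapped bits
        state = (state >> 1) | (feedback << (n - 1))  # shift right, feed back at top
        if state == 1:
            return step
    return full + 1

def maximal_taps(n):
    """Exhaustive search: tap masks whose LFSR cycles through every nonzero state."""
    full = (1 << n) - 1
    return [t for t in range(1, full + 1) if lfsr_period(t, n) == full]

print(maximal_taps(4))   # the tap masks that give the maximal period 15
```

The number of tap masks found this way matches the number of primitive degree-n polynomials over GF(2), since each maximal-period register corresponds to one; the search is feasible only for small n, which is why later codes were found by explicit construction instead.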

Particularly since the 1970s, many systematic methods have been tried for optimizing engineering designs by computer. Usually they are based on iterative improvement rather than systematic search. Some rely on linear programming or gradient descent. Others use methods such as simulated annealing, neural networks and genetic algorithms. But as discussed on page 342, except in very simple cases, the results are usually far from any absolute optimum. (Plant and animal breeding can be viewed as a simple form of randomized search done since the dawn of civilization.)
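A minimal sketch of one of the iterative improvement methods mentioned above, simulated annealing: accept downhill moves always, and uphill moves with a probability that shrinks as a "temperature" is lowered. The objective function, cooling schedule and parameters here are illustrative assumptions, not from the text.

```python
import math
import random

def anneal(f, x0, steps=20000, t0=2.0, seed=0):
    """Minimize f from x0 by simulated annealing with a linear cooling schedule."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9          # temperature falls linearly to ~0
        y = x + rng.gauss(0, 0.5)                # random local move
        fy = f(y)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if fy < fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# Illustrative objective with many local minima; global minimum 0 at x = 0.
f = lambda x: x * x + 10 * (1 - math.cos(2 * math.pi * x))
x, v = anneal(f, x0=5.0)
```

Even on this one-dimensional example the result depends on the schedule and the random seed; in realistic engineering design spaces, as the text notes, such methods usually stop far from any absolute optimum.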

© 2002, Stephen Wolfram, LLC