
No single axiom can reproduce all equivalences, since such an axiom must be of the form expr == a, yet expr cannot contain variables other than a, and so cannot for example reproduce a ∘ b == b ∘ a.
Using the results of this section it then follows that in the evolution of all 2D cellular automata of the type discussed on page 170 there exist purely repetitive configurations that remain unchanged.
In general, if a period m is possible then so must be all periods n for which p = {m, n} satisfies OrderedQ[Transpose[If[MemberQ[p/#, 1], Map[Reverse, {p/#, #}], {#, p/#}]] &[2^IntegerExponent[p, 2]]]. Extensions of this to other types of systems seem difficult to find, but it is conceivable that when viewed as continuous mappings on a Cantor set (see page 869) at least some cellular automata might exhibit similar properties.
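The ordering test above is Sharkovskii's ordering on periods of continuous interval maps: each period is split into its power-of-two part and its odd part, and the two pairs are compared. A Python sketch of the same test (the function name forces is ours, not from the text):

```python
def forces(m, n):
    """True if, under Sharkovskii's ordering, the presence of period m
    in a continuous interval map forces the presence of period n."""
    def split(k):
        # write k as 2^a * o with o odd; return (a, o)
        a = 0
        while k % 2 == 0:
            k //= 2
            a += 1
        return a, k
    a, o = split(m)
    b, q = split(n)
    if o == 1 and q == 1:   # both powers of two: larger forces smaller
        return n <= m
    if o == 1:              # m a power of two, n not: no forcing
        return False
    if q == 1:              # n a power of two: any non-power-of-two m forces it
        return True
    # neither a power of two: compare (power-of-two part, odd part) pairs
    return (a, o) <= (b, q)
```

So, for example, period 3 forces every other period, while a power of two forces only the smaller powers of two.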
Finding models [of systems] Even though a model may have a simple form, it may not be at all easy to find.
But particularly in making contact with existing physics it is almost inevitable that all sorts of technical formalism will be needed—and to maintain balance in this book I have not included this here.
Among the 2^32 k = 2, r = 2 rules, 428 do, and of these 2 are symmetric and 6 are reversible; all of these are just shift and identity rules.
Other conserved quantities The conserved quantities discussed so far can all be thought of as taking values assigned to blocks of different kinds in a given state and then just adding them up as ordinary numbers.
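As a concrete illustration (a standard example, not drawn from this passage): the elementary rule 184, the simple traffic model, conserves the total number of black cells, i.e. the sum of a value assigned to each block of size 1. A Python sketch checking this on random periodic configurations:

```python
import random

def step(cells, rule=184):
    """One step of an elementary cellular automaton (periodic boundaries)."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

random.seed(0)
state = [random.randint(0, 1) for _ in range(40)]
for _ in range(20):
    new = step(state)
    assert sum(new) == sum(state)   # the number of 1s is conserved by rule 184
    state = new
```

Summing the value of each size-1 block here is the simplest case of the block-sum quantities described above; larger blocks give further conserved quantities for other rules.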
Given an original DNF list s, this can be done using PI[s, n]:

PI[s_, n_] := Union[Flatten[FixedPointList[f[Last[#], n] &, {{}, s}]〚All, 1〛, 1]]
g[a_, b_] := With[{i = Position[Transpose[{a, b}], {0, 1}]}, If[Length[i] == 1 && Delete[a, i] === Delete[b, i], {ReplacePart[a, _, i]}, {}]]
f[s_, n_] := With[{w = Flatten[Apply[Outer[g, #1, #2, 1] &, Partition[Table[Select[s, Count[#, 1] == i &], {i, 0, n}], 2, 1], {1}], 3]}, {Complement[s, w, SameTest -> MatchQ], w}]

The minimal DNF then consists of a collection of these prime implicants. Sometimes it is all of them, but increasingly often when n ≥ 3 it is only some. … Given the original list s and the complete prime implicant list p the so-called Quine–McCluskey procedure can be used to find a minimal list of prime implicants, and thus a minimal DNF:

QM[s_, p_] := First[Sort[Map[p〚#〛 &, h[{}, Range[Length[s]], Outer[MatchQ, s, p, 1]]]]]
h[i_, r_, t_] := Flatten[Map[h[Join[i, r〚#〛], Drop[r, #], Delete[Drop[t, {}, #], Position[t〚All, #〛, {True}]]] &, First[Sort[Map[Position[#, True] &, t]]]], 1]
h[i_, _, {}] := {i}

The number of steps required in this procedure can increase exponentially with the length of p.
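For illustration, the same two stages can be sketched in Python: repeatedly merging implicants that differ in one position to find the primes, then searching for a smallest covering subset. All the function names here are ours, and the cover search is plain brute force, which, like the Quine–McCluskey procedure, can take exponentially many steps:

```python
from itertools import combinations

def merge(a, b):
    """Combine two implicants (strings over '0','1','-') that differ
    in exactly one fixed position; return None if they cannot merge."""
    diff = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    if len(diff) == 1 and '-' not in (a[diff[0]], b[diff[0]]):
        i = diff[0]
        return a[:i] + '-' + a[i + 1:]
    return None

def prime_implicants(minterms, n):
    """All prime implicants of the n-variable function with the given minterms."""
    group = {format(m, '0%db' % n) for m in minterms}
    primes = set()
    while group:
        merged, used = set(), set()
        for a, b in combinations(sorted(group), 2):
            c = merge(a, b)
            if c is not None:
                merged.add(c)
                used.update((a, b))
        primes |= group - used   # implicants that merge no further are prime
        group = merged
    return primes

def covers(imp, m, n):
    return all(c in ('-', d) for c, d in zip(imp, format(m, '0%db' % n)))

def minimal_cover(minterms, n):
    """Smallest set of prime implicants covering every minterm (brute force)."""
    primes = sorted(prime_implicants(minterms, n))
    for k in range(1, len(primes) + 1):
        for combo in combinations(primes, k):
            if all(any(covers(p, m, n) for p in combo) for m in minterms):
                return list(combo)
    return []
```

For the 2-variable function with minterms {01, 10, 11} (i.e. a OR b) this yields the two prime implicants '-1' and '1-', both of which are needed in the minimal cover.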
If one considers all 2^n possible sequences (say of 0's and 1's) of length n then it is straightforward to see that most of them must be more or less algorithmically random. For in order to have enough programs to generate all 2^n sequences most of the programs one uses must themselves be close to length n. … But even though one knows that almost all long sequences must be algorithmically random, it turns out to be undecidable in general whether any particular sequence is algorithmically random.
In 1909 Émile Borel had formulated the notion of normal numbers (see page 912) whose infinite digit sequences contain all blocks with equal frequency. … To disallow procedures say specially set up to pick out all the infinite number of 1's in a sequence Alonzo Church in 1940 suggested that only procedures corresponding to finite computations be considered. … And unlike earlier proposals the consequences of this definition seemed to show remarkable consistency (in 1966 for example Per Martin-Löf proved that in effect it covered all possible statistical tests)—so that by the early 1990s it had become generally accepted as the appropriate ultimate definition of randomness.