Growth rates [of functions]

One can characterize most functions by their ultimate rates of growth. In basic mathematics these might be n, 2n, 3n, ... or n^{2}, n^{3}, ..., or 2^{n}, 3^{n}, ..., or 2^{n}, 2^{2^{n}}, 2^{2^{2^{n}}}, ... To go further one begins by defining an analog to the Ackermann function on page 906:

f[1][n_] = 2n; f[s_][n_] := Nest[f[s - 1], 1, n]

f[2][n] is then 2^{n}, f[3][n] is the iterated power 2^{2^{...^{2}}} (a tower of n 2s), and so on. Given this one can now form the "diagonal" function

f[ω][n_] := f[n][n]

and this has a higher growth rate than any of the f[s][n] with finite s. This higher growth rate is indicated by the transfinite index ω. And in direct analogy to the transfinite numbers discussed above one can then in principle form a hierarchy of functions using operations like
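As a check on these definitions one can compute the first few values explicitly (they grow so quickly that only tiny arguments are practical); for example

Table[f[s][n], {s, 2, 3}, {n, 4}]

yields {{2, 4, 8, 16}, {2, 4, 16, 65536}}, while Table[f[ω][n], {n, 3}] yields {2, 4, 16}.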

f[ω + s][n_] := Nest[f[ω + s - 1], 1, n]

together with diagonalization at limit ordinals. In practice, however, it becomes increasingly difficult to prove that the functions defined in this way actually terminate and yield definite values—and indeed for f[ε_{0}] this can no longer be proved using the ordinary axioms of arithmetic (see below). Yet it is still possible to define functions with even more rapid rates of growth. An example is the so-called busy beaver function (see page 1144) that gives the maximum number of steps that any Turing machine of size n that eventually halts can run before halting, when started from a blank tape. In general this function must grow faster than any computable function, and is not itself computable.
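While the busy beaver function itself is not computable, any individual machine can be simulated. The following is a minimal sketch (the name TMSteps and the rule format are illustrative, not standard): it takes rules of the form {state, symbol} -> {state, symbol, offset} covering every case, treats state 0 as a halt state, and imposes an explicit step limit max, since in general no computable bound on running times exists:

TMSteps[rules_, max_] := Module[{s = 1, tape = <||>, pos = 0, t = 0, r},
  (* run from a blank tape, counting steps until the halt state 0 is reached *)
  While[s =!= 0 && t < max,
   r = {s, Lookup[tape, pos, 0]} /. rules;
   tape[pos] = r[[2]]; s = r[[1]]; pos += r[[3]]; t++];
  If[s === 0, t, Infinity]]

With the rules {{1, 0} -> {2, 1, 1}, {1, 1} -> {2, 1, -1}, {2, 0} -> {1, 1, -1}, {2, 1} -> {0, 1, 1}} this yields 6 steps, which is in fact the maximum for any 2-state, 2-color machine of this kind.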