[FOM] Re: Fractional Iteration and Rates of Growth

Dmytro Taranovsky dmytro at MIT.EDU
Sat May 6 18:40:20 EDT 2006


In the previous posting, I defined a function f meant to represent
natural primitive recursive rates of growth:
f(1, 1, x) = 2*x
f(n, a, x) = the a-th iterate (in x) of f(n, 1, x)
f(n+1, 1, x) = f(n, x, x)
with f(0, a, x) = x + a, with f analytic (using analytic continuation
along the real axis), and with f(n, a, 0) = 0 and f'(n, a, 0) > 0 (the
derivative taken in x) for every integer n > 0 and every real a.
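
At nonnegative integer arguments the recursion can be evaluated
directly.  A minimal Python sketch (the function name and the
treatment of a = 0 as the identity iterate are mine, not from the
original posting):

def f(n, a, x):
    # f(0, a, x) = x + a
    if n == 0:
        return x + a
    # the zeroth iterate is the identity map
    if a == 0:
        return x
    # f(n+1, 1, x) = f(n, x, x)
    if a == 1:
        return f(n - 1, x, x)
    # f(n, a, x) = a-th iterate (in x) of f(n, 1, .)
    return f(n, 1, f(n, a - 1, x))

assert f(1, 1, 5) == 10    # f(1, 1, x) = 2*x
assert f(1, 3, 5) == 40    # f(1, a, x) = 2^a * x
assert f(2, 1, 3) == 24    # f(2, 1, x) = f(1, x, x) = 2^x * x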

At integral arguments, f is just an analogue of the Ackermann function.
For all real a and b, f(n, a+b, x) = f(n, a, f(n, b, x)) (x>0).  f is
analytic in both x and a.  f(1, a, x) = 2^a*x.
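
For n = 1 the closed form makes the translation law easy to check at
non-integer arguments; a small numerical illustration (names are
mine):

def f1(a, x):
    # closed form from above: f(1, a, x) = 2^a * x
    return 2.0 ** a * x

a, b, x = 0.3, 1.9, 5.0
# f(1, a+b, x) = f(1, a, f(1, b, x)), since 2^(a+b)*x = 2^a*(2^b*x)
assert abs(f1(a + b, x) - f1(a, f1(b, x))) < 1e-9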

Joe Shipman asked:  How do you know that the resulting power series [for
f] have nonzero radii of convergence?

Local existence and uniqueness are guaranteed by general results on
fractional iteration.  However, in general (and for f(3, 1, x)), the
series will not have an infinite radius of convergence, hence the need
for analytic continuation.

A remarkable feature of natural asymptotic rates of growth is that they
are comparable:  given two such rates, either one is strictly higher
than the other or the two are equal.  Because of that, the naturalness
of f is a rather strong (if imprecise) claim.
Note, however, that the rate of growth proper should be distinguished
from random fluctuations and from oscillatory behavior.  For example,
g(x) = 2^x/2^1 + 2^2^x/2^2^2 + 2^2^2^x/2^2^2^3 + ... grows unevenly,
with quasi-period 1.  This is because f(3, -1, g(x)) should look like
y = ceiling(x-1) for large x:  the n-th term of the sum is an n-fold
iterated exponential scaled to equal 1 at x = n, so a taller tower
takes over as the dominant term shortly after x passes each integer.
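
As a rough numerical check of the uneven growth, one can evaluate the
first three terms of g over a small range where double precision still
suffices (the helper names are mine):

import math

def tower(x, n):
    # n-fold iterated base-2 exponential:
    # tower(x, 1) = 2^x, tower(x, 2) = 2^2^x, ...
    for _ in range(n):
        x = 2.0 ** x
    return x

def g3(x):
    # first three terms of g(x) = sum over n of tower(x, n) / tower(n, n)
    return sum(tower(x, n) / tower(n, n) for n in (1, 2, 3))

for x in (1.0, 1.5, 2.0, 2.5, 3.0):
    # log2 of g3 steps up faster and faster as a taller tower takes over
    print(x, math.log2(g3(x)))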

Dmytro Taranovsky

