Copy of an earlier email on this subject from Nick:
Dr. Furcy, we were talking about the minimization function and what
its purpose was. First of all, here are some alternate descriptions of
the minimization function, some of which talk about the while/for-loop
distinction I mentioned:
http://www.earlham.edu/~peters/courses/logsys/recursiv.htm
http://en.wikipedia.org/wiki/Primitive_recursive_function#Computer_language_definition
This is starting to make sense to me - we can't do unbounded loops
with only composition and primitive recursion, because composition is
necessarily finite and primitive recursion "counts down," as it were,
to the base case. In other words, when we use primitive recursion we
must specify a base case (usually when y = 0) and a "demoting" case,
that is, a case that starts with y' and ends with y (when y != 0).
This clearly describes a bounded loop - yet in order to compute all
the functions that Turing machines can compute, we must be allowed to
execute unbounded loops... In a sense, we need to be able to ask
questions that may or may not have an answer (we need to be able to
run our program forever, since we don't know where the answer lies -
f(0), f(1), f(2), f(3), etc.).
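To make that concrete, here is a quick sketch of my own (not from the
book) of how a primitive-recursive definition of addition turns into a
bounded for loop in Python - the loop bound y is fixed before the loop
ever starts, which is the whole point:

def successor(n):
    # the basic successor function S(n) = n + 1
    return n + 1

def add(x, y):
    # addition defined by primitive recursion on y:
    #   add(x, 0)  = x                       (base case)
    #   add(x, y') = successor(add(x, y))    ("demoting" case)
    # the recursion always counts y down to 0, so it is a bounded loop:
    result = x
    for _ in range(y):      # exactly y iterations, known in advance
        result = successor(result)
    return result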
This is what minimization seems to get us, since it asks the question
"At what point does the function f return 0 for some inputs x1, ...,
xn?" - even though the function may *never* return 0 for those inputs.
Does that make as much sense as I think it does?
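Minimization, as I understand it, is then literally an unbounded while
loop. Here is another sketch of my own, using a hypothetical helper
called minimize:

def minimize(f, *args):
    # mu-operator: return the least y such that f(*args, y) == 0.
    # The while loop is unbounded - if no such y exists, it runs
    # forever, which is exactly what composition and primitive
    # recursion alone cannot express.
    y = 0
    while f(*args, y) != 0:
        y += 1
    return y

For example, minimize(lambda x, y: 0 if y * y >= x else 1, 10) returns
4, the least y with y*y >= 10 - but hand it a function that never
returns 0 and it simply never halts.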
The quintessential example of a function that is not definable
without minimization is the Ackermann function -
http://en.wikipedia.org/wiki/Ackermann_function and also in the book,
p. 84. An outline of the proof that it is not primitive recursive
follows, and it isn't too bad. It doesn't explicitly make use of
minimization, though... It states only that because you can demonstrate
that the Ackermann function grows faster than any primitive recursive
function, it must not be primitive recursive - how does that use
minimization, though?
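For reference, here is the definition from that Wikipedia page written
out directly in Python - the nested recursion on both arguments is
what pushes it past primitive recursion:

def ackermann(m, n):
    # A(0, n) = n + 1
    # A(m, 0) = A(m - 1, 1)
    # A(m, n) = A(m - 1, A(m, n - 1))
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

Even tiny inputs blow up: ackermann(3, 3) is 61, but ackermann(4, 2)
already has 19,729 digits (and the naive recursion above would never
finish computing it).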
The only thing I can think of right now has to do with the definition
- the function A(m, n) = n + 1 when m = 0. Is that the application of
minimization? It seems to be going from two inputs to one output when
one of the inputs (m) is 0... is that right?