Bounded versus unbounded minimization

David

Feb 2, 2009, 10:10:41 PM
to Computability and Logic
I'd like to develop a better understanding of the reasons why adding
the minimization operator [Mn] enlarges the set of functions beyond
the set of functions generated by composition [Cn] and primitive
recursion [Pr] (Chapter 6).

Last week, we discussed the fact that p.r. functions are necessarily
total since the basic functions are total, and totality is preserved
by composition and primitive recursion. On the other hand, recursive
functions may be partial since, for some argument x, there may be no y
for which f(x,y)=0 (in the Mn operator), in which case f is not
regular and the resulting function Mn[f] is partial (not defined for
some x). OK, so some of the functions generated through Mn are
partial, which is already outside the scope of p.r. functions.
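To make this concrete for myself, here is a minimal sketch of Mn as a
blind search (written in OCaml; the function name and types are mine,
not the book's):

(* Mn[f] x = the least y such that f x y = 0.  When f is not regular
   at x, no such y exists and the search never returns: this is
   exactly how Mn[f] ends up partial. *)
let mn (f : int -> int -> int) (x : int) : int =
  let rec search y = if f x y = 0 then y else search (y + 1) in
  search 0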

However, that is not the whole story, since some recursive but non-
p.r. functions are total. One example discussed in Chapter 7 is the
Ackermann function. So my question remains: what does the Mn operation
bring to the table above and beyond what Cn and Pr can do, in the
context of total functions?

Nick mentioned the contrast between bounded and unbounded loops, which
he had read about somewhere else. But I believe that he was really
talking about the difference between bounded (terminating) and
infinite loops, where an infinite loop would occur when the input x is
not in the domain of the (partial) function.

However, this argument only explains the difference between total and
partial functions. I'd like to dig deeper into what the fundamental
difference is between p.r. and recursive non-p.r. TOTAL functions.

The fact that the Min operator (bounded minimization, Section 7.1)
always yields p.r. functions when applied to p.r. functions, while the
Mn operator (unbounded minimization) can take us outside the p.r.
functions, leads me to believe that the contrast between bounded and
unbounded computations is actually crucial in this matter, even among
total functions.
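For comparison, here is the bounded version as a sketch, again in
OCaml and again with my own names; I am also assuming the convention
that Min returns the bound itself when no witness exists below it
(conventions vary):

(* Min[f] x bound = the least y < bound with f x y = 0, or bound if
   there is none.  The search stops after at most bound steps, so it
   always terminates. *)
let min_bounded (f : int -> int -> int) (x : int) (bound : int) : int =
  let rec search y =
    if y >= bound then bound
    else if f x y = 0 then y
    else search (y + 1)
  in
  search 0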

Did the discussion of the Ackermann function in Section 7.3 help you
with this issue?

Mike

Feb 2, 2009, 11:42:29 PM
to Computability and Logic
Copy of an earlier email on this subject from Nick:

Dr. Furcy, we were talking about the minimization function and what
its purpose was. First of all, here are some alternate descriptions of
the minimization function, some of which talk about the while/for loop
distinction I mentioned:
http://www.earlham.edu/~peters/courses/logsys/recursiv.htm
http://en.wikipedia.org/wiki/Primitive_recursive_function#Computer_language_definition

This is starting to make sense to me. We can't do unbounded loops
with only composition and primitive recursion, because composition is
necessarily finite and primitive recursion "counts down," as it were,
to the base case. In other words, when we use primitive recursion we
must specify a base case (usually when y = 0) and a "demoting case,"
that is, a case that starts with y' and ends with y (when y != 0).
This clearly describes a bounded loop. Yet in order to compute all
the functions that Turing machines can compute, we must be allowed to
execute unbounded loops: in a sense, we need to be able to ask
questions that may or may not have an answer (we need to be able to
run our program forever, since we don't know where the answer lies:
f(0), f(1), f(2), f(3), etc.).
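Here is what I mean by the for-loop flavor of primitive recursion (a
rough OCaml sketch; the names are mine):

(* Pr[base, step]: h x 0 = base x and h x (y+1) = step x y (h x y).
   The recursion counts down from y to the base case, so it performs
   exactly y + 1 steps: a bounded loop whose trip count is known in
   advance. *)
let pr (base : int -> int) (step : int -> int -> int -> int)
    (x : int) (y : int) : int =
  let rec h y = if y = 0 then base x else step x (y - 1) (h (y - 1)) in
  h y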

This is what minimization seems to get us, since it asks the question
"At what point does the function f return 0 for some inputs x1, ...,
xn?" even though the function may *never* return 0 for those inputs.
Does that make as much sense as I think it does?
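Compare that with minimization written as a literal while loop (same
caveats: an OCaml sketch, my names):

(* Keep asking "is f x y = 0?" for y = 0, 1, 2, ... with no bound in
   sight.  If the answer is always "no", the loop runs forever. *)
let mn_while (f : int -> int -> int) (x : int) : int =
  let y = ref 0 in
  while f x !y <> 0 do incr y done;
  !y

The for loop knows how many iterations it will make before it starts;
the while loop does not, and that seems to be the whole difference.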

The quintessential example of a function which is not definable
without minimization is the Ackermann function
(http://en.wikipedia.org/wiki/Ackermann_function, and also in the book
on p. 84). An outline of the proof that it is not primitive recursive
follows there, and it isn't too bad. It doesn't explicitly make use of
minimization, though... It states only that because you can
demonstrate that the Ackermann function grows faster than any
primitive recursive function, it must not be primitive recursive. How
does that use minimization, though?

The only thing I can think of right now has to do with the
definition: the function A(m,n) = n+1 when m = 0. Is that the
application of minimization? It seems to be going from two inputs to
one output when one of the inputs (m) is 0... is that right?
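For reference, here is the full definition we are talking about,
transcribed directly from the three clauses on the Wikipedia page
(OCaml again):

(* A(0,   n)   = n + 1
   A(m+1, 0)   = A(m, 1)
   A(m+1, n+1) = A(m, A(m+1, n))
   Every clause terminates, so A is total, yet it grows faster than
   any primitive recursive function. *)
let rec ack m n =
  if m = 0 then n + 1
  else if n = 0 then ack (m - 1) 1
  else ack (m - 1) (ack m (n - 1))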

David

Mar 3, 2009, 10:57:35 AM
to Computability and Logic
Yes, this thread is still alive! I have implemented the basic p.r.
functions, as well as the substitution, recursion, and minimization
operators in ML. I am hoping to go all the way to the Ackermann
function. But I am not committing to any time frame. 8^) Hey, spring
break is coming up...
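To give a flavor of what I mean, here is roughly the shape the
definitions take; this is an OCaml sketch rather than my actual ML
code, and all names and representation choices here are mine. A k-ary
function is represented as int list -> int:

let zero (_ : int list) : int = 0
let succ = function x :: _ -> x + 1 | [] -> invalid_arg "succ"
let id i xs = List.nth xs (i - 1)            (* projection id_i *)

(* Cn[f; g1,...,gm] xs = f (g1 xs, ..., gm xs) *)
let cn f gs xs = f (List.map (fun g -> g xs) gs)

(* Pr[f, g]: h (xs, 0) = f xs and h (xs, y+1) = g (xs, y, h (xs, y)),
   taking the last argument in the list as y. *)
let pr f g args =
  match List.rev args with
  | [] -> invalid_arg "pr"
  | y :: rev_xs ->
      let xs = List.rev rev_xs in
      let rec h y = if y = 0 then f xs else g (xs @ [y - 1; h (y - 1)]) in
      h y

(* Mn[f] xs = the least y with f (xs, y) = 0; diverges if none exists. *)
let mn f xs =
  let rec search y = if f (xs @ [y]) = 0 then y else search (y + 1) in
  search 0

As a sanity check, addition comes out as pr (id 1) (cn succ [id 3]),
and pr (id 1) (cn succ [id 3]) [3; 4] evaluates to 7.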

On a related front, thanks to Mike, I now have an English translation
of Ackermann's paper. Also, Mike sent us a link to a 1948 paper by
Raphael Robinson in which he proves that the function is not p.r.
since it grows faster than any p.r. function. In that paper, he
refers to an earlier paper of his from 1947 on primitive recursive
functions. I now also have
a copy of that one, which I plan to read first.

Nick, I just gave hard copies of these papers to Mike. You can pick up
yours when you get back. By the way, do you know when you'll be back
in town?

Nick Peterson

Mar 3, 2009, 7:30:38 PM
to computabili...@googlegroups.com

Hey guys,

That's great! I have no idea when I'll be back; it could be anywhere from 2 to 6 months. I'll probably have a better idea by the end of the month, however. I'll keep you posted.

Nick

