# FOM: Feasibility and (in)determinateness of the standard numbers

Tue Mar 31 16:00:10 EST 1998

Martin Davis wrote:

> The view I expressed on fom is that the only problem I have
> with the cumulative hierarchy is its length.

It seems that if this length increases, then the "length" of the
"standard model" of arithmetic will increase too. The stronger
the axioms, the more examples we will have of concrete Turing
machines which can be (feasibly) proved to halt, and therefore
the more corresponding "provably recursive/existing natural
numbers" we will have. (Examples of such comparatively small
numbers provably existing in PRA, or even in much weaker systems
of arithmetic: 2^1000, 2^{2^1000}, etc.) Therefore the "length"
of the "standard model" seems to be rather indeterminate.
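As an informal illustration (a Python sketch, not part of the
original discussion), one can contrast a trivial machine whose
halting is both provable and feasibly checkable by running it,
with the same machine on an infeasible input, where halting is
still provable but the run itself could never be carried out:

```python
def halts_after(n):
    # A trivial "machine": counts down from n. That it halts after
    # exactly n steps is provable by induction for any n.
    i = n
    while i > 0:
        i -= 1
    return True

# For a feasible input we can actually run the machine:
assert halts_after(2 ** 20)

# For n = 2**1000 the machine still provably halts, but actually
# running it would take 2**1000 steps -- an infeasible process.
```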

By the way, the following questions arise. Is the feasibly
provable ordering m <_fp n between (feasibly) provably existing
natural numbers m and n a *linear* ordering? (More precisely,
m <_fp n means that there exists a proof of feasible length, in a
version of arithmetic, of the statement m < n.) Can we always
provably decide the equality m = n between provably existing
numbers? In particular, is the position of the number 2^1000
fully determinate among all provably existing numbers? In other
words, what is the "proper value" of the number 2^1000? "How
large" is it? (Compare: how large is the cardinal 2^{aleph_0}?)
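One concrete sense in which 2^1000 is accessible only through its
short notation can be shown in a small Python aside (not in the
original post): its positional numerals are tiny, while a unary
numeral for it could never be written down.

```python
# 2**1000 "provably exists" via a short positional numeral, yet a
# unary numeral for it (a string of 2**1000 strokes) is infeasible.
n = 2 ** 1000
print(len(str(n)))     # length of the decimal numeral: 302 digits
print(n.bit_length())  # length of the binary numeral: 1001 bits
```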

Have these questions any meaning?

Torkel Franzen wrote:
>

>    >Changing (or rejection)
>    >of some of these rules may result in a different notion of natural
>    >numbers with different understanding and intuition.
>
>   This points up a general problem with the sort of revisionism you
> are arguing for. Your general ideas and aims are quite intelligible on
> the basis of our ordinary understanding of arithmetic. You suggest (in
> your paper) that this "ordinary understanding" is, on closer
> inspection, contrary to basic intuition and experience, and that it
> could be replaced, as stated above. However, you present these views
> using a logical apparatus steeped in the tradition that you describe
> as contrary to basic intuition and experience,

Yes, I really do use some traditional proof-theoretic finitary
considerations, involving infeasible numbers like 2^1000 in the
ordinary way, with the goal of finding *any* reasonable guarantee
that some simple formal version of feasible arithmetic (FA, also
called FEAS) is *almost* (or feasibly) consistent. If you now see
that FA is indeed feasibly consistent, then the goal has been
achieved. This is somewhat analogous to using the non-finitary
theory ZFC, or induction up to epsilon_0, to prove the
consistency of the "finitary" theory PA.

On the other hand, there exists *no* model of FA representable in
the ZFC universe. (Here an illusion of a contradiction with
G"odel's completeness theorem for predicate calculus may arise.
But this is not the case!) Nevertheless, we have an *informal
model* of feasible numbers, which is therefore very different
from ordinary first-order models. This is the place where basic
intuition and experience do not work, or are insufficient, and
should be somewhat corrected or extended. Also note that the
formalization of FA has rather unusual features. Say, implication
is in general non-transitive, and even the feasible number 1000
behaves as a kind of infinity. (Moreover, there is yet another,
even stranger peculiarity of FA.)

> The view that the traditional conception can
> or should be dispensed with, on the other hand, is a kind of
> revisionary enthusiasm difficult to substantiate.

My "enthusiasm" has a rather different character. Nothing that
was achieved by heavy work "should be dispensed with". However,
what about the ordinary intention of mathematicians to embed
"everything" in a unique mathematical "universe" like that of
ZFC? I only say that the feasibility concept, taken
"foundationally", is not straightforwardly embeddable in ordinary
mathematics as we know it. This seems to touch on some illusions
we have. If you, Torkel, can fully realize this, so much the
better. However, I have some doubts about this, especially in
connection with the very unusual intended "model" for FA.

>   >What makes the powerset 2^N of natural numbers (i.e. the set of
>   >infinite binary strings) to be indeterminate *in contrast to*
>   >the powerset 2^1000={0,1}^1000 of {1,2,...1000} which should be
>   >determinate (according to the traditional view and *contrary* to
>   >my intuition)?
>
>   Answers to this question tend to boil down, one way or another, to
> the statement that 2^N is not generated by a rule.

Which rule, then, generates 2^1000 (as a set of binary strings)?
Of course, enumerating all these strings in lexicographic order
is beside the point, being an infeasible and highly unrealistic
process.
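To make the infeasibility of that enumeration vivid, here is a
Python aside (not from the original post): lexicographic
generation of {0,1}^n is easy for tiny n, but the number of
strings doubles with every increment of n.

```python
from itertools import product

# Lexicographic enumeration of {0,1}^n works for small n:
n = 4
strings = ["".join(bits) for bits in product("01", repeat=n)]
print(len(strings))             # 2**4 = 16 strings
print(strings[0], strings[-1])  # first and last: 0000 1111

# But the count doubles with each step in n, so for n = 1000 the
# enumeration would require 2**1000 steps -- no realistic process
# could ever write the set down element by element.
print(len(str(2 ** 1000)))      # the count itself has 302 digits
```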