# [FOM] Indispensability of the natural numbers

Wed May 19 10:42:41 EDT 2004

Timothy Y. Chow wrote:
> On Tue, 18 May 2004, Vladimir Sazonov wrote:
>
>>Whatever are our beliefs, the mental concept of the natural numbers
>>is something vague, anyway. It is only illusion of something solid.
>
>
> Do you agree that our mental concepts of symbols and rules are also vague
> illusions of something solid?

> We cannot directly observe schoolchildren
> applying *rules*; we only observe them doing specific things like making
> marks on paper, and it requires an act of mental abstraction to interpret
> our observations of children by saying, "Ah!  These children are applying
> *rules*!"  A rule cannot be weighed on a scale or poked with a thermometer.
> Similarly, it requires an act of
> mental abstraction from "1, 2, 3, ..." to the natural numbers.

As soon as we want to discuss formal rules *in general*, using
phrasing like "and so on" or three dots "...", and especially
if we try to theorize about this, then, I agree, we come to something
rather vague, because it is unclear what this "and so on"
means. It may be understood in different ways. If we assume that
there is *no last element* in the sequence of strings |, ||, |||, ...
(which may be considered as representations of natural numbers),
then this is consistent (in a formal *theory of feasible numbers*
based on some modification of classical first-order logic) with the
axiom \forall n (log log n < 10) (where 10 abbreviates ||||||||||).
Thus, intuitively, 2^{2^10} = 2^{1024} = infinity; that is, the
corresponding computation does not halt according to such a theory
of natural numbers. There is good reason to believe that this
theory is consistent, in the sense that nobody will be able to formally
deduce, say, | = || (although there are theorems formally deducible
in this theory, even more unusual than the above axiom, which may
look like a contradiction but formally are not). Note that this theory
(with the above axiom) has no model (in the framework of ZFC), although
it corresponds to some intuition about natural numbers and about a
restricted version of "and so on".
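To get a feeling for the magnitudes, here is a small Python sketch (my own
illustration, not part of the feasible-numbers theory itself): the axiom
\forall n (log log n < 10), with logs base 2, first fails at
n = 2^{2^10} = 2^{1024} -- a number that is short to write in binary or
decimal notation, but whose *unary* representation |, ||, |||, ... could
never be physically produced.

```python
# Illustration only (not part of the cited theory): the axiom
# "forall n, log log n < 10" (logs base 2) first fails at
# n = 2^(2^10) = 2^1024.

threshold = 2 ** (2 ** 10)     # 2^1024

print(threshold.bit_length())  # 1025: short in binary notation...
print(len(str(threshold)))     # 309 decimal digits: short in decimal...

# ...but as a unary string |||...| it would need ~1.8e308 strokes,
# dwarfing the roughly 1e80 atoms in the observable universe.
atoms_in_universe = 10 ** 80   # rough standard estimate
print(threshold > atoms_in_universe ** 3)  # True
```

So from the feasibility standpoint the computation of 2^{2^10} indeed
never terminates on any physically realizable string of strokes.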

There is another, traditional formalization of natural numbers (PA),
where the negation of the sentence \forall n (log log n < 10) is
derivable.

Both theories formalize the concept "and so on". Which of them is
the right formalization of this concept? You may have some preference,
but the fact is that there are various understandings of "and so on",
and there is no *standard* concept of the row of natural numbers.
PA formalizes a "longer" row than the above theory of feasible numbers.
(Bounded Arithmetic formalizes a row of "intermediate" length:
there is a concrete Turing machine, computing an exponential tower
of height 2^1000, which is not provably halting in BA but is
provably halting in PA.) Second-order arithmetic formalizes an even
"longer" row than PA, etc. What is the "real length" of N? Does this N
exist at all, or is it only an illusion of existence and of something
solid, merely because we are using sufficiently solid (see below)
formalisms describing N? If N is considered as an idealization, as a
"longest possible" N which could be taken as the "standard" N, I have
no idea how to understand this or what I could do with this "longest
possible" in any rational way. Only use it as a magic ritual? Could we
axiomatize this "longest possible"? If not, why appeal to this highly
unclear idealization at all? Each time we use a formalization of
"natural numbers" we have some (possibly vague) idea about N, and that
is all. No standard N can be explicated. Only inside strong theories
like ZFC can we do that, say via the least infinite ordinal omega. But
from the "outside" point of view this omega still remains vague.

This is about numbers or formal rules *in general*. On the other hand,
when we are doing something (deducing theorems, checking mathematical
proofs, or just walking), we need not simultaneously think about how
we are doing it. We apply rules or our skills automatically. This is
rather simple and concrete (after some training). (What is not simple
is to do this *in a good way*: to walk in the "right" direction to
achieve some goal, with the help of our intuition.) In this sense any
metatheory (a theory of how we do something, or of what it is we are
doing - formal derivations, etc.) is unnecessary in this situation.
Taking this into account, we have sufficiently solid ground for doing
mathematics in the ordinary rigorous way. The possibility that some of
our formalisms may be inconsistent influences only our intuition about
the imagined entities or "worlds" described by these formalisms. In
this way we also avoid the vicious circle (which seemingly bothers you
so much) of describing natural numbers (which are something like the
syntactic objects |, ||, ...) in terms of formal systems (also of a
syntactic nature).

>>Your further considerations (which I will not quote) are actually
>
>
> No, if you think this, then you have not read my article carefully enough,
> but have merely assumed on a superficial reading that it is just like many
> other articles on the topic that are superficially similar.

I understood you as saying that if PA is inconsistent, then there is a
problem with doing any formal derivations (or meaningful derivations?)
because of the analogy between any formal rule and the successor
operation "generating" N, where it is even unclear what this N is
(and therefore what a formal system is).

You wrote:

> I therefore argue that if an explicit PA-proof of 0=1 is found, then
> moving to something like I_Delta_0(exp) does not really salvage much,
> even if one is unable to find a proof of 0=1 in that system.  The first
> reason is that a PA-proof of 0=1 demonstrates a fundamental problem with
> our concept of the natural numbers, and so we won't be able to assign a
> coherent *meaning* to the sentences of first-order arithmetic any more.

Bounded Arithmetic, or I_Delta_0(exp), may be formulated in a
language with bounded quantifiers only, or with restricted use
of unbounded quantifiers. The meaning of such sentences
may still be *sufficiently* coherent for us (unless a contradiction
is found, if there is any, in these weaker systems).

> (Recall my remark about halfway plausible substitutes.)  Formal languages
> are interesting mostly because they capture some features of an entity
> that we're studying, and if the entity vanishes, then why bother with the
> formal language any more?

First, the (highly informal, vague and floating) entity (N)
does not vanish. Only our understanding of it and intuition about it
may change in some way.

Second, we do not actually need the *general* concept of a formal
language. We need only be able to *use* formalisms, based also
on some intuition about what these formalisms are about. The
intuition may be vague. But we need nothing more. You seem to
want, so EAGERLY, something more, which is impossible in
principle, as if this were an AWFUL WORLD DISASTER for you.

> The second reason is that I_Delta_0(exp) is a
> formal system, and it won't be clear any more exactly what I_Delta_0(exp)
> *is*

Any formal system is a *finite object* (a system of generating
rules), assuming our *practical* ability to properly manipulate
it. This is the quite solid ground on which mathematics (or
its main feature, rigour) is essentially based.
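As a sketch of this point (the names and the toy rule are my own
invention): a "formal system" in this minimal sense is just a finite
package of generating rules that we manipulate mechanically, for
instance the single axiom | together with the successor rule s -> s|.

```python
# Hedged sketch, my own toy example: a formal system as a finite
# object -- one axiom "|" and one generating rule s -> s + "|".

AXIOM = "|"

def successor(s):
    """The single generating rule: append one stroke."""
    return s + "|"

def generate(steps):
    """Mechanically apply the rule `steps` times, collecting results."""
    out, s = [], AXIOM
    for _ in range(steps):
        out.append(s)
        s = successor(s)
    return out

print(generate(4))  # ['|', '||', '|||', '||||']
```

The finite text above *is* the system; using it requires no metatheory
of what a "rule in general" means.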

> (although I guess we could generate a finite fragment of it and check
> with each other that we agree on that finite fragment, or we could cross
> our fingers and hope that discrepancies between different human beings
> won't show up in practice).

Where do all these problems come from? Everything is so clear!

>
> Coming back to Process A and Process B, I believe that if one seriously
> maintains that we "don't know" if PA is consistent, then one should also
> maintain that we "don't know" if there is a string that both Process A and
> Process B will generate.  This might seem like a surprising claim, since
> the relevant fact about Processes A and B can be formalized in systems
> much weaker than PA.  However, what I claim is that the very act of (1)
> setting up a weak formal system, including rules (rules!) of inference,
> and (2) asserting that theorems in this weak system "say" something about
> what Processes A and B "really do," presupposes the coherence of the
> concept of the natural numbers.  The inconsistency of PA undermines this
> presupposition.  In fact, it even undermines the presupposition that we
> know what Processes A and B *are*.

They *are* just two formal (and finitely presented) generating rules.
We know very well how to use them without invoking any philosophy,
metamathematics or anything else. I see no good reason to raise
these questions in connection with a possible(?) contradiction in PA.
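A minimal illustration in Python of what "just two finitely presented
generating rules" means here (the specific rules are invented by me;
the actual Processes A and B under discussion are defined elsewhere in
the thread): we can apply such rules mechanically and compare their
finite output fragments, with no philosophy involved.

```python
# Hedged illustration: two finitely presented generating processes.
# The particular rules are my own stand-ins, not Chow's actual
# Processes A and B.

def process_a(bound):
    """Generates |^2, |^4, |^6, ... up to `bound` strokes."""
    return {"|" * n for n in range(2, bound + 1, 2)}

def process_b(bound):
    """Generates |^3, |^6, |^9, ... up to `bound` strokes."""
    return {"|" * n for n in range(3, bound + 1, 3)}

# Purely mechanical use of the rules: intersect finite fragments.
common = process_a(20) & process_b(20)
print(min(common, key=len))  # '||||||' (6 strokes), generated by both
```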

>
>
>>The problem would be only how to change our intuition on natural
>>numbers. The same problem on sets was resolved quite efficiently (even
>>if only temporarily) after Russel's paradox. I see no essential
>>difference.
>
>
> My remark about nonstandard models of arithmetic was an attempt to
> illustrate an essential difference.  In the case of set theory, there
> are many candidates for replacing any particular version of set theory
> that we might use temporarily.  In the case of the natural numbers,
> there is no candidate in sight.  If you disagree, name one.

If you mean a "standard" model, then there is no such model for either
ZFC or PA. If you mean a replacement for PA (if it proves inconsistent),
this could probably be PRA or Bounded Arithmetic or something like
that, where the inconsistency proof of PA (if any) does not go
through, so that we can (maybe temporarily) rely on these systems.

Actually, I do not understand what you really want. I only feel that
your questions were already answered by others. Unary strings (natural
numbers) and strings in any alphabet are things of the same nature (if
we temporarily forget some complexity issues). That is trivial. (I do
not want to say that various theories, say, on binary strings, or the
very concept of a string, are trivial.)

I can agree that, for example, the arithmetical sentence Con(PA)
does not properly express our informal sentence "PA is consistent",
but not because it is unnatural to represent syntactic
concepts arithmetically. The problem is that, *in the context of
PA*, Con(PA) says something too strong in comparison with the
informal sentence "PA is consistent". Consistency means the
impossibility of *physically* writing a proof leading to 0=1.
PA "deals" with such big numbers, and with (codes of) such long
syntactic objects, that they do not correspond to anything physically
existing in our world. Thus, the problem is not in the arithmetical
language, but in our intuition. Of course, Con(PA) may be considered
as an idealized extrapolation of the original informal meaning.
This is also interesting...

My suggestion is to think about what would happen to mathematics
and its concepts, such as N, if a contradiction were found,
from the point of view of mathematical practice. Working
mathematicians *use* both intuition and the formalisms supporting it
(as we can notice by observing their work), but they hardly ever
reflect on metamathematical questions (formalizing syntax, etc.),
and these should not bother us too much either. They should be
treated as a *separate* or even *secondary* (though possibly related)
question. Why head for a vicious circle when it is so easy to avoid?

Kind regards,