FOM: The instrumental value of problematic notions and principles

Solomon Feferman sf at Csli.Stanford.EDU
Tue Dec 2 03:11:19 EST 1997


This continues my response to John Steel and others about parts of my
paper "Does mathematics need new axioms?".  Incidentally, the final
version of that is available by anonymous ftp at 

  ftp://gauss.stanford.edu/pub/papers/feferman   as   newaxioms.ps.gz 

There are no substantive changes from the previous version, but the
exposition is a bit improved thanks to the editorial work of the Amer.
Math. Monthly, where it is to appear next year.  I have added some brief
notes about KT (Kruskal's Theorem) and EKT (the Extended Kruskal Theorem).
Also, after explaining that all or almost all
present-day scientifically applicable mathematics can be formalized in a
system [W] conservative over PA, I have added the following sentences,
prompted by earlier discussion:

"I am not by any means arguing that everyday mathematical practice should
be restricted to working in such subsystems [of analysis].  The
instrumental value of 'higher' and less restricted set-theoretical
concepts and principles is undeniable.  The main concern is, rather, to
see: what, fundamentally, is needed for what?"

Let's take the case in point of CH, the subject of several recent
messages.  Here's an old story.  In the mid-60s, Ax and Kochen proved the
decidability of the first-order theory of the p-adic fields assuming AC and
CH; the latter entered through the use of saturated model arguments.  It
followed by the absoluteness of the statement of decidability that it could
be proved in just plain ZF (the absoluteness step is sketched below).  Now
that was just decidability "in principle".  A few
years later, Paul Cohen gave an explicit decision procedure; exit ZF.
And a few years after that (I don't have the exact dates at hand), Barwise
and Schlipf showed that you could use recursively saturated model arguments
in place of saturated model arguments more generally, for a variety of
applications.  This put us back in set theory, but very minimally; I don't
think they said anything about how much set theory is needed, but I expect
their work could be formalized in ACA_0, which is conservative over PA.
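
As for that absoluteness step, here is a sketch only, assuming the standard
route through Goedel's constructible universe L (which may or may not be the
argument Ax and Kochen themselves used): for a statement sigma absolute
between L and V,

  $$\mathrm{ZFC} + \mathrm{CH} \vdash \sigma
    \;\Longrightarrow\; \mathrm{ZF} \vdash \sigma^{L}
    \;\Longrightarrow\; \mathrm{ZF} \vdash \sigma ,$$

the first implication because ZF proves that L satisfies ZFC + GCH, so the
given proof relativizes to L; the second by absoluteness.  Since the
statement of decidability is of this absolute form, AC and CH drop out of
its proof.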

Now how does that fit with my view that CH is inherently vague?  First, in
response to Neil Tennant's most recent criticisms of that, I must make
plain that it is so on count (1): "it does not have a precise sense".
Nevertheless, as the above example shows, entertaining it could still be
useful.  I expect someone would eventually have come to a decision
procedure for p-adic fields without the detour through CH, but the fact is
that through the development of set-theoretical model theory, one was led
more quickly to expect it.  

I don't understand Tennant's alternative (2): "it has a precise sense, but
it involves certain predicates (such as '...is bald') whose (precise)
sense is such that they admit 'borderline cases'."

In that case, I would say of such a statement that it does not have a
precise sense, and so it falls again under (1).  To compound my
non-understanding,
Tennant says something like: Well, CH is formulated precisely in the
language of set theory, and Feferman accepts some statements of set
theory as having definite meaning, so how can he not accept CH likewise?
John Case has basically answered this.  We recognize certain statements as
having definite meaning prior to their being formalized in set theory. In
particular, I accept all statements of first order number theory
(including Goldbach's Conjecture GC) as having definite meaning.
Do we really need set theory to understand what they mean?  Do we need to
know the precise range of the variables in the cumulative hierarchy?
That's really a topsy-turvy view of things.
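
For instance, GC itself can be written out entirely in the first-order
language of arithmetic, say as

  $$\forall n\,\bigl((\exists k\,(n = 2 \cdot k) \wedge n > 2) \rightarrow
      \exists p\,\exists q\,(\mathrm{Prime}(p) \wedge \mathrm{Prime}(q)
      \wedge n = p + q)\bigr),$$

where Prime(x) abbreviates $x > 1 \wedge \forall a\,\forall b\,(x = a \cdot b
\rightarrow a = 1 \vee b = 1)$.  One grasps this with the variables ranging
over the natural numbers, with no appeal to the cumulative hierarchy.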

Incidentally, I don't agree with Case that statements of second-order
arithmetic in general have a definite meaning.  But lots of statements
about the real numbers have definite meaning because they don't depend on
the exact range of the variables.  In particular, that holds for the
algebra of the reals, even when extended by lots of definable analytic
functions, beginning with exponentiation.

Infinitesimals provide the paradigm case of a problematic notion which has
had high instrumental value, both historically and through the present day,
in physical modeling and in lots of mathematics (differential geometry,
etc.),
and all this without the benefit of foundations.  It's comforting to know
that there are theories of infinitesimals which are conservative over
theories with standard models, thus justifying those uses.  But the
non-standard models themselves are not canonically involved in that
justification.  I've taught calculus all the way from the dx, dy days to
the epsilon-delta days, and the former are always more readily convincing
than the latter to beginning students.  We don't go wrong that way because
we're only doing things that can be justified in the end, one way or
another.  
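
A typical bit of such dx, dy reasoning, just by way of illustration: for
$y = x^2$,

  $$dy = (x + dx)^2 - x^2 = 2x\,dx + (dx)^2, \qquad
    \frac{dy}{dx} = 2x + dx \approx 2x ,$$

with the leftover infinitesimal dx discarded at the end.  In a theory of
infinitesimals conservative over the standard one, that last step is
legitimized (e.g. by taking the standard part in nonstandard analysis),
which is the sense in which we don't go wrong.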

Another time I will take up the question: why emphasize scientifically
applicable mathematics?  But for the moment, just take note that it includes
a lot of "pure" analysis, either directly or right behind it.

Sol Feferman





