[FOM] Non-standard analysis

Harvey Friedman friedman at math.ohio-state.edu
Tue Jul 29 19:50:10 EDT 2003


Reply to Urquhart on non-standard analysis. On 7/28/03 11:21 AM, "Alasdair
Urquhart" <urquhart at cs.toronto.edu> wrote:

> I posted the apparently paradoxical assertion that there are results in the
> theory of stochastic processes that can be proved only by non-standard
> methods.  This seems to conflict with well-known conservative extension
> results, so Steve Simpson and Harvey Friedman rightly called me on this.
> 
Yes, this is a matter of rather careless, or perhaps cleverly chosen,
wording on the part of those authors, as I suspected.
 
> I quote from the new monograph: "Model Theory of Stochastic Processes" by
> Fajardo and Keisler, (Lecture Notes in Logic), p. x.
> 
> "From the viewpoint of nonstandard analysis, our aim is to understand why
> there is a collection of results about stochastic processes which can only be
> proved by means of nonstandard analysis.  Our explanation is that these
> applications require adapted spaces with properties such as saturation and
> homogeneity which are easy to get using nonstandard methods but difficult or
> impossible without using them."
> 
They are talking about which conceptual framework to cast certain kinds
of results in. They claim that a particular family of mathematical
structures is better suited to presenting certain mathematical
phenomena.

This kind of argument and claim is normal in mathematics.

The reason that use of the word "foundational" here is rather confusing is
that there are really three quite different aspects to nonstandard analysis
that must be separated in order to avoid a lot of confusion.

1. Nonstandard analysis as a way of providing a model for the informal
reasoning that was prominent among mathematicians after the advent of
the calculus but before the advent of epsilon-delta (or arguably
predating the calculus).

2. Nonstandard analysis as a development in standard mathematics which
purports to simplify standard mathematics, replacing proofs in standard
mathematics with simpler proofs in standard mathematics.

3. Nonstandard analysis as a certain approach(es) to certain specialized
branches of mathematics, replacing certain standard theorems in certain
settings in standard mathematics with alternative standard theorems in
alternative settings in standard mathematics.

The aspect that has any claim to be among the great foundational ideas of
the 20th century, as Urquhart feels, seems to be 1. However, that aspect has
not proved to be all that foundationally illuminating. Mathematicians still
think that epsilon-delta was a vast improvement over informal thinking about
infinitesimals, so, foundationally, there is a great preference for normal
foundations.

I think I know what Godel might have been anticipating, from reading his
remarks - I will take this up below.

If Godel's vision about this were to come about, then indeed the claim for
the foundational importance of nonstandard analysis via 1 would be much much
stronger. 

With 2, there is definitely some activity - I think that some of
Gromov's work is worth looking at. But the nature and body of 2 is not
as strong as that of a number of other 20th century items in
mathematical logic, even within the realm of 2 itself.

With 3, all of it is, I think, extremely controversial. First of all, as I
said, the vast preponderance of attention in mathematics today is in the
realm of rather concrete contexts and structures. The kind of
nonconstructivity involved in the kind of abstract nonstandard analysis we
are talking about, with ultrapowers and saturation, etc., is SEVERAL orders
of magnitude beyond what is involved in, say, the nonconstructivity
surrounding the normal fundamental existence theorems about one variable
(uniformly) continuous functions from the closed unit interval into itself
taught in a beginning course in real analysis.

In the long run, mathematicians demand some sort of concrete approximations
usable for actual applications on computers, especially in their probability
theory and their (partial) differential equations, and here obviously one
cannot be using nonstandard analysis - at least in any directly recognizable
form. One has to use, perhaps, some sort of finite form of nonstandard
analysis - but this is likely to look like totally standard combinatorics or
combinatorial analysis, not nonstandard analysis in the sense we are talking
about.

> 
> So, the explanation for the apparent paradox is this. Standard stochastic
> process theory is interested in the properties of distributions, but does not
> much care about the space where these processes live. Hence, we can get better
> results by using things like saturated adapted spaces constructed by the Loeb
> construction.
> 
What does "better" mean here? Conceptually clearer and more
illuminating, probably. Again, this must be a controversial matter. And
of course the most illuminating and important sense of "better" would
be a practical one.

> In other words, nonstandard analysis provides us not just with short proofs of
> old theorems,
> 
Not nearly enough to justify extravagant claims. But it would be
interesting to analyze such cases to see what is involved in removing
the nonstandard methods.

> but also with new and interesting objects, that help in proving new results.
> 
The new results will presumably be in the new setting, which is, I assume,
highly controversial. But I should add: the new setting is part of standard
mathematics. 

The problem with the new settings is that they are not definable, and so it
is difficult to see how they could possibly connect up with reality. I.e.,
when one finally wishes to compute things, then what is gained by using
undefinable settings?

What does finite nonstandard analysis look like? Obviously this is an
absurdity if taken literally. Perhaps one can use ideas surrounding
nonstandard analysis to create new interesting and important finite
mathematics.

> I stoutly adhere to my assertion that nonstandard analysis is a major
> foundational advance.
> 
That is a weaker statement. I stoutly adhere to my assertion that it is in a
completely different category than the handful of really big foundational
advances in the 20th century.

> Of course, Goedel and Turing are among my heroes, but so is Abraham Robinson.
> 
There is absolutely no comparison between what has emanated from Godel
and Turing and what has emanated from nonstandard analysis.

On the other hand, A. Robinson did some other significant things, so that
his total career is significantly more than his work on nonstandard
analysis.

But we are not talking about A. Robinson, we are just talking about
nonstandard analysis.

> I can call to my defence no less a person than Kurt Goedel himself.  Goedel
> wanted Robinson to be his successor at the Institute for Advanced Study, and
> in addition, allowed Robinson to publish some very interesting (and highly
> positive) remarks of his on non-standard analysis (see his Collected Works,
> Volume II, pp. 307-311).  These remarks (which Robinson printed in the 2nd
> edition of his "Non-standard Analysis") are the published version of comments
> that Goedel made after a talk of Robinson at the IAS. Some of the background
> to these remarks is explained in a very funny and illuminating anecdote of
> Bill Howard that is published in the interview by Amy Shell-Gellasch in Volume
> 25, No. 1 of the Mathematical Intelligencer (p. 41).  I should add that the
> whole interview with Bill Howard is a real treat.
> 
> 

I think I can take a guess as to what Godel might have been
anticipating. I think that Godel may have thought that nonstandard
analysis and new axioms of set theory were somehow connected. Perhaps
Godel thought that if one thinks nonstandardly, taking infinitesimals
seriously, then one might be able to come up with new axioms that allow
one to prove new concrete statements. You can more or less see this in
the reference you cite.

Of course, one cannot tell from the cited brief note whether Godel had
in mind accepting nonstandard notions at face value, as genuinely new
notions, or rather what they generate in standard ideas when you
interpret them standardly, as A. Robinson did.

In particular, one would like to be able to derive, or at least prove
the consistency of, certain large cardinal axioms either by thinking
nonstandardly, or by thinking about standard interpretations of
nonstandard ideas.

I don't see how to do this, but I have made a step towards this in the
context of general nonstandard mathematics.

More specifically, let us go back to one way of putting the main idea
of nonstandard mathematics in broad terms.

Suppose we are given a relational structure M. It turns out that we can
properly extend the structure M to a structure M' so that M and M' are in
some way similar.

Normally, this is stated in terms of first order elementary extensions.
Every infinite structure has a proper first order elementary extension.
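
For the record, the proof is a routine compactness argument. A minimal
sketch, in ad hoc notation:

% Every infinite structure M has a proper elementary extension.
% Sketch only; the notation is ad hoc, not from any particular source.
Let $M$ be an infinite structure for a first order language $L$. Expand
$L$ by a new constant $c$ and a constant $\underline{a}$ for each
$a \in M$, and let
\[
  T = \mathrm{ElDiag}(M) \cup \{\, c \neq \underline{a} : a \in M \,\},
\]
where $\mathrm{ElDiag}(M)$ is the elementary diagram of $M$. Any finite
subset of $T$ mentions only finitely many constants $\underline{a}$, so
it is satisfied in $M$ itself by interpreting $c$ as any element not
mentioned, which exists since $M$ is infinite. By compactness, $T$ has
a model $M'$; then $M \preceq M'$ while $c^{M'} \notin M$, so the
extension is proper.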

But there is an even more elementary, or if you like, algebraic, way of
stating this. If we are going to be algebraic, we might as well
restrict attention to what we call algebras. An algebra is a nonempty
set together with a finite list of constants and functions on that set.

An extension of an algebra is just an algebra with more elements, which
agrees with the original algebra on the elements of the original
algebra.

An infinite algebra is an algebra with an infinite domain.

THEOREM. Every infinite algebra has a proper extension with the same finite
subalgebras up to isomorphism.

The above is proved using compactness or ultrapowers, as A. Robinson
did. Of course, A. Robinson freely used first order predicate calculus
and related it directly to analysis.

But nevertheless, the above Theorem has the same flavor. E.g., it cannot be
proved without the axiom of choice, and even if the given infinite algebra
is rather explicit, no extension need be.
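
In outline, the ultrapower form of the argument runs as follows (a
sketch only, again in ad hoc notation):

% Every infinite algebra A has a proper extension with the same finite
% subalgebras up to isomorphism. Sketch via an ultrapower.
Let $A$ be an infinite algebra and $U$ a nonprincipal ultrafilter on
$\omega$. Form the ultrapower $A^{\omega}/U$, with the operations
defined coordinatewise; identifying each $a \in A$ with the class of
the constant function with value $a$ makes this an extension of $A$.
It is a proper extension: for injective $f : \omega \to A$, each set
$\{\, n : f(n) = a \,\}$ is at most a singleton, hence not in $U$, so
$[f]$ is not identified with any element of $A$. Now let
$B = \{[f_1], \dots, [f_k]\}$ be a finite subalgebra of the ultrapower.
Its full operation table, together with the inequalities
$[f_i] \neq [f_j]$ for $i \neq j$, amounts to finitely many conditions,
each of which holds coordinatewise on a set in $U$. Intersecting these
finitely many sets gives an $n$ at which $[f_i] \mapsto f_i(n)$ is an
isomorphism of $B$ onto a finite subalgebra of $A$. Conversely, every
finite subalgebra of $A$ is already a finite subalgebra of the
ultrapower.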

Now what if we want to preserve more properties?

THEOREM? Every infinite algebra has a proper extension with the same
finitely generated subalgebras up to isomorphism.

The above is FALSE.
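
A minimal counterexample (my choice of example; many others would do):

% The algebra (N, 0, S) witnesses the failure.
Consider $(\mathbb{N}, 0, S)$, the natural numbers with zero and
successor. Since $0$ generates everything, its only finitely generated
subalgebra is $(\mathbb{N}, 0, S)$ itself, in which every element is
the value of a closed term $S^k(0)$. Let $M'$ be any proper extension
and pick $z \in M'$ outside $\mathbb{N}$. The subalgebra of $M'$
generated by $z$ contains $z$, which is not the value of any closed
term, since closed terms still evaluate inside $\mathbb{N}$ ($M'$
agrees with the original algebra on $\mathbb{N}$). The property that
every element is the value of a closed term is isomorphism invariant,
so this finitely generated subalgebra of $M'$ is not isomorphic to
$(\mathbb{N}, 0, S)$.

However: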

PROPOSITION. Every sufficiently large algebra has a proper extension with
the same finitely generated subalgebras up to isomorphism.

THEOREM. The Proposition is equivalent, over ZFC, to the existence of a
measurable cardinal. The "break point" in the statement is the first
measurable cardinal.
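
To indicate where measurability enters, here is a sketch of the
direction from a measurable cardinal to the Proposition, in ad hoc
notation; the full equivalence is worked out in the preprint cited
below.

% Sketch: from a measurable cardinal to the extension property.
Let $\kappa$ be measurable, witnessed by a $\kappa$-complete
nonprincipal ultrafilter $U$ on $\kappa$, and let $A$ be an algebra
with $|A| \geq \kappa$. Form $A^{\kappa}/U$. As before, an injection
$f : \kappa \to A$ yields a new element, so the extension is proper.
Let $B$ be a finitely generated subalgebra of the ultrapower, with
generators $[f_1], \dots, [f_k]$. Its elements are the values
$t([f_1], \dots, [f_k])$ of the countably many terms $t$, and each
equality or inequality between two such values holds coordinatewise on
a set in $U$. Since $U$ is countably complete, the intersection of
these countably many sets is again in $U$; at any coordinate $\alpha$
in the intersection, $t([f_1], \dots, [f_k]) \mapsto
t(f_1(\alpha), \dots, f_k(\alpha))$ is an isomorphism of $B$ onto the
subalgebra of $A$ generated by $f_1(\alpha), \dots, f_k(\alpha)$. The
converse direction, from the Proposition back to a measurable
cardinal, is the substantial one.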

On the language side, one can instead use, e.g.,

"satisfying the same sentences of weak second order logic"
"being an elementary extension for weak second order logic"

(weak second order logic being second order logic with the set
variables ranging over finite sets), and also get the same equivalence
with measurable cardinals.

If we use second order logic, or elementary extensions for second order
logic, then we get an equivalence with what are called extendible
cardinals, which are far greater than measurable cardinals, and in fact
much bigger than lots of Woodin cardinals, or what is needed to prove
projective determinacy. Also, the break point is the first extendible
cardinal.

There is a whole complex of such results, all related, stated in

H. Friedman, Restrictions and Extensions,
http://www.mathpreprints.com/math/Preprint/show/index.htt

and also see 

H. Friedman, Working with Nonstandard Models,
http://www.mathpreprints.com/math/Preprint/show/index.htt

Harvey Friedman