[FOM] Misuse of standard terminology

Harvey Friedman hmflogic at gmail.com
Wed Aug 28 06:05:32 EDT 2013


The misuse of standard terminology with fixed meanings going back
decades, centuries, or even longer is an interesting phenomenon.
Normally, this is done to gain special attention for some topic or
other by jolting the reader and conveying a sense of "modernism" or
"revisionism".

I see some cases of this on the FOM now. Some are clear cut. Others are
not.

A clear cut case is this:

"Absorption (P∧(Q∨P) =P) is not considered to be a standard Boolean
equivalence. And, of course, P does not in general infer Q∨P, which is used
in C. I. Lewis' proof. See http://arxiv.org/abs/0812.4852 "

Obviously the equivalence of (P and (Q or P)) with P is considered by
everybody to be a standard Boolean equivalence, and obviously from P
everybody can infer Q or P. These are the standard uses of the
notions, which go back thousands of years.
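
As a sanity check, here is a minimal Python sketch, mine and purely
for illustration, that verifies both claims by brute-force truth
tables:

    from itertools import product

    def tautology(f):
        # Check f on every assignment of truth values to its arguments.
        n = f.__code__.co_argcount
        return all(f(*vals) for vals in product([False, True], repeat=n))

    # Absorption as an equivalence: (P and (Q or P)) <-> P.
    print(tautology(lambda p, q: (p and (q or p)) == p))   # True

    # Disjunction introduction: P implies Q or P.
    print(tautology(lambda p, q: (not p) or (q or p)))     # True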

Of course, one can invent new notions of and, or, if-then, iff,
"Boolean equivalence", et cetera, and that is fully legitimate -
although it may or may not be interesting, depending on what is done
with them and what insight is drawn.

What is completely illegitimate is to usurp the usual symbols and
terminology. The symbols and terminology must be altered in order to
avoid any possible confusion. E.g., one might use and*, or*, ifthen*,
iff*, and Boolean equivalence*.

A more problematic case is that of Tennant on Core Logic.
http://www.cs.nyu.edu/pipermail/fom/2013-August/017528.html

My initial impression is that something like Core Logic might have
significance for the foundations of mathematics, but the key results
that would indicate this are simply not known, or even formulated.

Ultimately it may not be exactly Core Logic that emerges as a
significant player in f.o.m. It might be "something like Core Logic".

So I will abbreviate "something like Core Logic" simply as "Core Logic".

I think a good direction is to see what happens if one provides a
universal foundation for mathematics in the sense of ZFC using Core
Logic.

As I understand it, Core Logic is presented by Tennant as a restricted
form of logic in which thinning on the left and thinning on the right
are disallowed. More specifically, adding assumptions on the left
(thickening) and adding conclusions on the right (thinning) are both
"invalid" or "rejected". Also disallowed are other rules of inference
in common use that are associated with thinning and thickening.
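
For concreteness, the two rules at issue, in standard sequent
notation, are:

    Gamma |- Delta                      Gamma |- Delta
    -----------------  (thin left)      -----------------  (thin right)
    Gamma, A |- Delta                   Gamma |- Delta, A

Both preserve classical validity - any valuation refuting the
conclusion already refutes the premise - but the added A need play no
role in the rest of the proof, which is the relevance objection.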

This is a prima facie case of misuse of standard terminology.

Instead of suggesting that there is something "wrong" or "invalid"
with the obviously valid thinning and thickening rules of logic, and
other associated rules, a much better move is to suggest that

***We keep track of thinnings and thickenings and all related devices
in mathematical proofs.***
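
Here is a minimal sketch, in Python, of what such bookkeeping might
look like; the proof representation and the rule names are
hypothetical, chosen only for illustration:

    # Hypothetical proof format: a list of (rule_name, sequent) steps.
    # The rule names below are illustrative, not from any fixed system.
    THINNING_LIKE = {"thin_left", "thin_right"}

    def thinning_profile(proof_steps):
        # Return the steps that use thinning-like rules, so a proof can
        # be certified "relevant" exactly when the result is empty.
        return [(i, rule, sequent)
                for i, (rule, sequent) in enumerate(proof_steps)
                if rule in THINNING_LIKE]

    example = [("axiom", "P |- P"),
               ("thin_left", "P, Q |- P"),     # Q plays no role later
               ("and_intro", "P, Q |- P and P")]
    print(thinning_profile(example))  # [(1, 'thin_left', 'P, Q |- P')]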

In particular, what I am suggesting is that Tennant's focus here is
really part of a wider subject that is interesting and important for
f.o.m.:

***What are the significant features of actual mathematical proofs?***

Now this question has in the past led to undeniably interesting and
deep developments. Here are three significant features of proofs that
are of great importance. Let us assume that we are talking about ZFC
formalizations.

1. Which of the axioms or axiom groups of ZFC are being used in the proof?

2. Are proofs by contradiction, uses of the law of excluded middle,
and "related moves" being used in the proof? If so, in what way? Keep
track of them.

3. Roughly how long is the proof, and with respect to what notions of
size?

Numbers 1 and 2 have been exploited fruitfully, both with specific
examples and theoretically. 1 leads ultimately to Reverse Mathematics
and other developments, and 2 leads to intuitionistic logic, double
negation translations, the so-called Friedman A-translation, various
completeness theorems for intuitionistic logic (none of which are
fully satisfactory yet), and so forth. 3 so far has only been
developed theoretically. We don't yet have the tools to determine
rough least lengths of proofs of actual theorems of mathematics.
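
To convey the flavor of what has been done with 2, here is a minimal
Python sketch of the Goedel-Gentzen double negation translation on
propositional formulas. The formula encoding is mine, purely for
illustration; the translation clauses are the standard ones. Classical
provability of A corresponds to intuitionistic provability of its
translation.

    # Formulas as nested tuples: ("atom", "P"), ("not", A),
    # ("and", A, B), ("or", A, B), ("imp", A, B).

    def neg(a):
        return ("not", a)

    def gg(f):
        # Standard Goedel-Gentzen clauses.
        tag = f[0]
        if tag == "atom":
            return neg(neg(f))
        if tag == "not":
            return neg(gg(f[1]))
        if tag in ("and", "imp"):
            return (tag, gg(f[1]), gg(f[2]))
        if tag == "or":
            return neg(("and", neg(gg(f[1])), neg(gg(f[2]))))
        raise ValueError(tag)

    # Excluded middle, P or not P, goes to an intuitionistic theorem:
    print(gg(("or", ("atom", "P"), ("not", ("atom", "P")))))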

So what about, in analogy with 2,

4. Are thinnings, thickenings, and other anti-relevance moves being
used? If so, in what way? Keep track of them.

I trust that Tennant can compare 4 and 2 and at least provide us with
some theoretical results of a similar nature. E.g., with 2, we know
that any Pi01 sentence (Goedel) or even Pi02 sentence (me) provable in
appropriate ZF is provable in appropriate intuitionistic ZF.
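
In display form (leaving aside exactly which formulations of the
systems are "appropriate"), the statement for the Pi02 case is:

    If ZF proves (for all n)(there exists m)(R(n,m)), with R a
    suitably decidable (e.g., primitive recursive) matrix, then
    intuitionistic ZF proves (for all n)(there exists m)(R(n,m)).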

This leads to the question:

***But is 4 significant for f.o.m.?***

It is clear that 2 is significant for f.o.m., and of course 1 is as
well. Also, the theoretical work on 3, going back to Goedel, is of
significance for f.o.m. In this limited posting, I am not going to get
into explaining just why. Suffice it to say that there is a web of
connections between 1, 2, 3 and various foundational purposes.

Another way of asking this question is this:

***If a proof makes no use of moves in 4, then what additional
mathematical information does that provide?***

This has very significant answers in connection with 1 and 2. With 2,
there are the algorithmic conclusions, started by Kleene with his
recursive realizability, which has been extended to intuitionistic ZF
(me).
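
A toy version of the realizability idea, in Python. The encoding is
mine and purely illustrative - Kleene's actual definition uses indices
of partial recursive functions - but it shows where the algorithmic
content lives: realizers of existential statements carry explicit
witnesses.

    # Formulas: ("atom", bool), ("and", A, B), ("or", A, B),
    # ("exists", pred) where pred(n) returns a formula.

    def realizes(r, f):
        tag = f[0]
        if tag == "atom":
            return f[1]                 # a true atom is trivially realized
        if tag == "and":
            return realizes(r[0], f[1]) and realizes(r[1], f[2])
        if tag == "or":
            side, r1 = r                # the realizer chooses a disjunct
            return realizes(r1, f[1] if side == 0 else f[2])
        if tag == "exists":
            n, r1 = r                   # the realizer supplies a witness
            return realizes(r1, f[1](n))
        raise ValueError(tag)

    # "There exists m with m*m = 49": the realizer carries the witness 7.
    print(realizes((7, None), ("exists", lambda m: ("atom", m * m == 49))))

Extracting such witnesses from intuitionistic proofs of Pi02 sentences
is exactly the kind of algorithmic conclusion at stake.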

Harvey Friedman

