FOM: relevance logic and complexity calibration

Neil Tennant neilt at
Tue Apr 21 12:23:15 EDT 1998

Harvey Friedman wrote:
As far as open mathematical problems are concerned, what Shipman is asking
us to do is this: given an open mathematical problem A, find (n,m) and a
sentence B in class Sigma^n_m or class Pi^n_m, and a proof that A is
equivalent to B in ZFC. One answer is better than another if the complexity
class of one is properly included in the complexity class of the other. The
proper inclusion relations are only that Sigma^n_m and Pi^n_m are both
properly included in each of Sigma^r_s and Pi^r_s if (n,m) is
lexicographically earlier than (r,s).
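The lexicographic ordering Friedman describes can be sketched in a few lines of code (a hypothetical illustration added for this archive, not part of the original exchange; the function names are my own):

```python
def lex_earlier(nm, rs):
    """True iff the pair nm = (n, m) is lexicographically earlier than rs = (r, s)."""
    return nm < rs  # Python compares tuples lexicographically

def properly_included(c1, c2):
    """c1, c2 are triples (kind, n, m) with kind 'Sigma' or 'Pi'.
    Per Friedman's description: Sigma^n_m and Pi^n_m are each properly
    included in each of Sigma^r_s and Pi^r_s iff (n, m) is
    lexicographically earlier than (r, s); the Sigma/Pi kind plays no role."""
    return lex_earlier(c1[1:], c2[1:])
```

So, for instance, Sigma^1_2 is properly included in Pi^1_3, while Pi^2_0 is not included in Sigma^1_5, since (2,0) is lexicographically later than (1,5).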

Now what about an existing mathematical theorem? Obviously, it is provably
equivalent to a sentence in the lowest levels of all: (for all x in
V(omega))(x = x), and (there exists x in V(omega))(x = x).
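The step behind this "obvious" equivalence deserves spelling out: any two theorems of a theory are provably equivalent over that theory, so the equivalence carries no information about either sentence. (My gloss, not part of the original post.)

```latex
\text{If } \mathrm{ZFC} \vdash T \text{ and } \mathrm{ZFC} \vdash S,
\text{ then } \mathrm{ZFC} \vdash T \leftrightarrow S,
```

since each direction of the biconditional follows from the outright provability of its consequent. Taking S to be (for all x in V(omega))(x = x), which ZFC trivially proves, every ZFC-theorem is ZFC-provably equivalent to a sentence at the lowest level.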

What we need is a notion of intrinsic complexity. Here we need reverse
mathematics, or the approach of reverse mathematics. 

I wonder whether this isn't a case where the Shipman-intuitions could
be accommodated by appealing instead to a notion of *relevant* logical
implication? A mathematical theorem T and either of the quantified
sentences (Qx in V(omega))(x=x) are "logically equivalent" modulo ZFC
only because each is a theorem of ZFC, not because each features as a
relevantly used premiss within a deduction of the other as a
conclusion (modulo ZFC).

Suppose C is to be some "complexity diagnostic" Pi^n_m or Sigma^n_m
sentence for some mathematical theorem T. Couldn't C be chosen
non-trivially by insisting that
(i) there should be a relevant deduction (modulo ZFC) of T from C
(ii) there should be a relevant deduction (modulo ZFC) of C from T ?

Harvey's reaction to this suggestion has been: "It seems to me that
this is a good idea if and only if a typical serious mathematical
theorem can be proved with relevant deductions only."

Now there are at least two senses of "relevant deduction" to which one
might appeal here. I think one of them, but not the other, will
do. The non-starter would be the sense in which a relevant deduction
of Q from P_1,...,P_n is a deduction within some system like the
Anderson-Belnap system R, which does not contain disjunctive
syllogism.  In this sense of "relevant", we do not have an appropriate
conservation theorem guaranteeing that results provable in the
standard way have closely enough related versions provable in the
"relevant" way. Indeed, I believe it is still an open problem whether
the four squares theorem is in R# (the relevant closure of Peano's
axioms, in this sense of "relevant"). This challenge to the relevant
logic community was posed by Friedman and Burgess some time ago, and
has not yet been met.
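For reference, disjunctive syllogism is the rule that R rejects, since together with or-introduction it would reinstate ex falso quodlibet via Lewis's argument. (A standard observation, added here as a gloss.)

```latex
\frac{A \lor B \qquad \neg A}{B}
\quad (\text{disjunctive syllogism})
```

Lewis's argument: from A infer A v B by or-introduction; from A v B and not-A infer B by disjunctive syllogism; hence A, not-A together yield an arbitrary B, which the relevantist refuses to accept.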

The other sense of "relevant", the one I think *will* work, is given by
the notion of a cut-free, dilution-free proof in a sequent system with
appropriate logical rules for introducing logical operators on the
left and/or on the right of a sequent. For such a system (call it CR)
we have the following metatheorem:

	If there is an ordinary classical proof of the sequent X:Q
	then there is a proof in CR of Y:Q or of Y:emptyset, for some
	subset Y of X.
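A toy instance of the metatheorem (my illustration, not Tennant's): the classically provable sequent A, B : A uses the premiss B only by dilution, and CR instead proves the subsequent A : A, exposing B as irrelevant.

```latex
% Classical proof, using dilution (thinning on the left):
\frac{A : A}{A,\, B : A}\ (\text{dilution})
% Cut-free, dilution-free CR proof of a subsequent Y : Q,
% with Y = \{A\} \subseteq X = \{A, B\}:
A : A \quad (\text{axiom}).
```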

Note that I am not claiming that proofs in CR will be as economical as
ordinary proofs using cuts. We do know, however, that cuts can be
eliminated (Gentzen's Hauptsatz); and then a linear transformation
will turn the resulting cut-free proof into a cut-free, dilution-free
proof, in CR, of some subsequent of the original result. Only if a
premiss survives this process can it be said to be relevant to the
conclusion of the sequent.

Thus my proposal for "complexity calibration" of any theorem T will be
that for the complexity-diagnostic sentence C there should be proofs
in CR of two sequents of the form

	X,T:C	and	Y,C:T

where X and Y are subsets of the set of axioms of ZFC. In the first
proof (of the sequent X,T:C) the premiss T will "really be doing some
work, modulo X" to yield the conclusion C. Likewise, in the second
proof (of the sequent Y,C:T), the premiss C will "really be doing some
work, modulo Y" to yield the conclusion T.

Neil Tennant