[FOM] the essences of (nonclassical) logics (it was: mismatch)

giovanni sambin sambin at math.unipd.it
Tue Jun 17 10:04:57 EDT 2003


I take up Harvey Friedman's plea for a wide discussion on "the 
essences of nonclassical logics" (the addition of brackets in the 
subject is mine!).
In fact, since I have been working in this field for several years, I 
feel it my duty to contribute my opinions, and some references, to 
the discussion, which was started by the recent posting by Guglielmi 
on the "mismatch".

At 4:23 +0200 15-06-2003, Alessio Guglielmi wrote:
>There is no mismatch, because, when going bottom-up, the connectives 
>at the object level (^, V) become corresponding structural relations 
>at the meta level

The starting principle of Basic Logic (introduced in G. Sambin, G. 
Battilotti and C. Faggian, Basic Logic: reflection, symmetry, 
visibility, J. Symbolic Logic 65 (2000), pp. 979-1013) is exactly 
that any connective is "created" by solving a "definitional equation" 
saying that a meta-level link is reflected in equivalent terms by a 
connective at object-level (see examples below). In Basic Logic (as 
well as in all its extensions by structural rules which include 
exponential-free Linear Logic, Intuitionistic Logic and Classical 
Logic, besides some of the Quantum Logics) any connective is 
introduced by the principle of reflection. In this way there is no 
danger of "mismatch". The source of mismatch in Guglielmi's first 
example is, as far as I can see at a glance, the use of one-sided 
sequents.

At 4:23 +0200 15-06-2003, Alessio Guglielmi wrote:
>2   THE MISMATCH HAS UNDESIRABLE PROOF THEORETICAL CONSEQUENCES

I fully agree with this. On the positive side, I expect that one can 
prove a general theorem by which any logic introduced following the 
principle of reflection will automatically satisfy cut-elimination. 
At the moment, what has been achieved through Basic Logic is 
***one*** cut-elimination procedure which applies to ***all*** logics 
L mentioned above. Note: one and the same procedure works for all of 
them simultaneously: roughly speaking, you throw in any proof in a 
logic L, and the outcome will be a cut-free proof (with the same 
endsequent) in the same logic L. See C. Faggian, G. Sambin, A unified 
approach to logical calculi and normalization of proofs, Lecture 
Notes for the 11th European Summer School in Logic, Language and 
Information (ESSLLI), Utrecht University, 1999.

At 4:23 +0200 15-06-2003, Alessio Guglielmi wrote:
>2.3   There is a simple philosophy of Gentzen's sequent calculus, 
>for which, grossly speaking, the `meaning' of connectives is defined 
>by their rules, where connectives are `brought to the meta level' 
>with no reference to the context, according to some 
>harmony-preserving principles.

The idea underlying Basic Logic and the principle of reflection is to 
pass from rules to what I call the definitional equation of a 
connective. The definitional equation says that the meta-level link 
and the object-level connective perfectly correspond to each other. 
Inference rules are *deduced* from the definitional equation. Thus 
the meaning of a connective is moved from rules to the definitional 
equation.

Let's see two examples to clarify.

Notation:
A,B,... are (atomic) propositions
A is, B is, ... are generic judgements (like A is true, A is available, etc.)
We only need two meta-level links, *and* and *yields*. These will be 
enough to "create" all connectives.
The usual Gentzen notation for sequents can be seen just as a 
shortening of compound judgements using ***only*** the links *and* 
and *yields*. For example,
C1,...,Cm |- D1,...,Dn is short for
(C1 is *and* ... *and* Cm is) *yields* (D1 is *and* ... *and* Dn is).
The meaning of *and* and *yields* is explained by what we assume 
about them, that is only:
identity:  A |- A
composition, or cut:  Gamma |- A     Gamma', A |- Delta
                       ---------------------------------
                           Gamma, Gamma' |- Delta
(+ the symmetric of this)
conjunction: from A is *and* B is one can infer both A is and B is, 
and conversely.

[Since this differs from usual, please note that there is no need of 
a link of the kind *or*; see the example of the usual disjunction A v 
B below.]

The connective * called times in linear logic is introduced as the 
solution of the definitional equation:

(1) A * B |- Delta iff A,B |- Delta

This equation as it stands is still a "wish". To solve it, we find 
acceptable inference rules for * which, taken together, are 
equivalent to (1).
One direction of (1) is already acceptable:

*-formation:    A,B |- Delta
                 --------------
                 A * B |- Delta

The other direction is not good as it stands:

*-implicit reflection:   A * B |- Delta
                          --------------
                          A,B |- Delta

since it assumes * to be known (imagine you have to instruct a robot 
by giving inductive rules). However, one can easily find equivalent 
but acceptable formulations of *-implicit reflection. First, one must 
trivialize its premiss by choosing Delta = A * B. Then one obtains:

*-axiom   A,B |- A * B

Then one applies composition to the axiom and the premises
Gamma |- A and Gamma' |- B to obtain (note: Gentzen's blank between 
premises is to be read as *and*):

*-explicit reflection:   Gamma |- A     Gamma' |- B
                          ---------------------------
                             Gamma,Gamma' |- A * B

Not yet finished: we now have to show that *-explicit reflection is 
actually equivalent to what we started from. By trivializing it 
(Gamma=A, Gamma'=B) we obtain the *-axiom back, and from the *-axiom 
by composition we obtain *-implicit reflection.

In conclusion: the two rules *-formation and *-explicit reflection 
are the good inference rules for *, and they are ***deduced*** from, 
and actually equivalent to, the definitional equation.
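The two cuts in this derivation can even be checked mechanically. 
Below is a minimal sketch in Python, under the simplifying assumption 
that sequents are pairs of tuples and the contexts Gamma, Gamma' are 
placeholder symbols; the representation and function names are mine, 
not part of the paper:

```python
# Sequents are pairs (antecedent, succedent) of tuples of formula names.
# This sketch mechanizes only the two cuts that turn the *-axiom into
# *-explicit reflection; it is illustrative, not a full proof checker.

def cut(left, right, formula):
    """Composition: from Gamma |- A and Gamma', A |- Delta,
    infer Gamma, Gamma' |- Delta (cutting on `formula`)."""
    gamma, succ = left
    ante, delta = right
    assert succ == (formula,), "cut formula must be the left premise's succedent"
    assert formula in ante, "cut formula must occur in the right antecedent"
    rest = list(ante)
    rest.remove(formula)      # antecedents are multisets; order is immaterial
    return (gamma + tuple(rest), delta)

star_axiom = (("A", "B"), ("A*B",))   # *-axiom:  A, B |- A * B

p1 = (("Gamma",), ("A",))             # premise:  Gamma  |- A
p2 = (("Gamma'",), ("B",))            # premise:  Gamma' |- B

step = cut(p1, star_axiom, "A")       # Gamma, B |- A * B
conclusion = cut(p2, step, "B")       # Gamma', Gamma |- A * B (up to order)

assert sorted(conclusion[0]) == ["Gamma", "Gamma'"]
assert conclusion[1] == ("A*B",)
```

The final assertions confirm that the conclusion is Gamma, Gamma' |- 
A * B, that is, exactly the conclusion of *-explicit reflection.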

Example of disjunction v:
The definitional equation is

(2)  A v B |- Delta iff A |- Delta *and* B |- Delta

Note: the connective v is formed on the left! That is why we don't 
need a link for "or". The equation (2) is solved following exactly 
the same pattern as above. The result will be well-known rules:

v-formation:   A |- Delta      B |- Delta
                ---------------------------
                    A v B |- Delta

v-implicit reflection:   A v B |- Delta    (+ the same with B)
                          --------------
                            A |- Delta

v-axiom   A |- A v B    (+ the same with B)

v-explicit reflection:     Gamma |- A
                          ----------------
                          Gamma |- A v B    (+ the same with B)

The first and last rules are the "good" ones.
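As in the * case, the explicit-reflection step is recovered from the 
v-axiom by a single application of composition; a minimal sketch in 
Python, with sequents as pairs of tuples (an illustrative toy 
representation of mine, not the paper's notation):

```python
# Sequents as (antecedent, succedent) pairs; composition (cut) as usual.

def cut(left, right, formula):
    """From Gamma |- A and Gamma', A |- Delta infer Gamma, Gamma' |- Delta."""
    gamma, succ = left
    ante, delta = right
    assert succ == (formula,) and formula in ante
    rest = list(ante)
    rest.remove(formula)
    return (gamma + tuple(rest), delta)

v_axiom = (("A",), ("AvB",))          # v-axiom:  A |- A v B
premise = (("Gamma",), ("A",))        # premise:  Gamma |- A

conclusion = cut(premise, v_axiom, "A")   # Gamma |- A v B
assert conclusion == (("Gamma",), ("AvB",))
```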
Note that this gives a better explanation of the symmetry between v 
and &. The definitional equation for & is

(3)  Gamma |- A & B iff Gamma |- A *and* Gamma |- B

which is symmetric to (2) in the sense that the roles of left and 
right (of the sign |-) have been switched. This gives a symmetry at 
the level of meaning (the definitional equations) and not only at the 
level of the formal system, as usual.

Last remark: by adding contexts on the side of active formulae, one 
obtains definitional equations with two parameters (typically, Gamma 
and Delta), whose solutions give Linear Logic. Adding contexts only 
at the left, but adding moreover structural rules of weakening and 
contraction, one obtains the rules for intuitionistic logic. Finally, 
adding everything one obtains classical logic.
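To illustrate how the structural rules act on the same toy 
representation of sequents as pairs of tuples (a sketch of mine, not 
the paper's formalism), weakening and contraction on the left can be 
written as:

```python
# Structural rules on sequents represented as (antecedent, succedent) pairs.
# Sketch only: adding these rules on the left is the step from a linear to
# an intuitionistic discipline of contexts.

def weaken_left(seq, formula):
    """Weakening: from Gamma |- Delta infer Gamma, A |- Delta."""
    ante, succ = seq
    return (ante + (formula,), succ)

def contract_left(seq, formula):
    """Contraction: from Gamma, A, A |- Delta infer Gamma, A |- Delta."""
    ante, succ = seq
    assert ante.count(formula) >= 2, "contraction needs two occurrences"
    rest = list(ante)
    rest.remove(formula)
    return (tuple(rest), succ)

s = (("A",), ("C",))              # A |- C
s = weaken_left(s, "B")           # A, B |- C
s = weaken_left(s, "B")           # A, B, B |- C
s = contract_left(s, "B")         # A, B |- C
assert s == (("A", "B"), ("C",))
```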

All details can be found in the JSL paper mentioned above.

At 4:23 +0200 15-06-2003, Alessio Guglielmi wrote:
>In fact, there is a value in having simple deductive systems, all 
>designed according to the same methodology. Having to tailor the 
>shape of sequents to each logic requires an effort, both for the 
>proof theorist and for the user of the deductive system, who has to 
>learn a new metalanguage for every system.

I obviously propose the methodology briefly explained above. It gives 
a single framework for all "good" logics. ("Good" here is of course a 
bit partisan: it means obtainable by the principle of reflection.)
Note that I have nothing to say about exponentials or modalities. It 
seems that the principle of reflection is not useful in those cases.

At 4:23 +0200 15-06-2003, Alessio Guglielmi wrote:
>As I said above, the standard way of dealing with the mismatch is to 
>enrich the structure of the meta level. The idea is simple: if 
>branching and commas are not enough, let's supply some more 
>structure.
[...]
>As far as I know (and please help me because I feel rather insecure 
>here) one strong bid in this direction comes from Belnap's display 
>calculus.

Belnap's Display Calculus is not a good solution, in my opinion. In 
fact, its "cut-elimination" does not produce a good subformula 
property, and hence doesn't give decidability and a proof-search 
procedure as a corollary. In philosophical terms, perhaps one can say 
that display calculi are not always "analytic".
The spirit of Basic Logic is the opposite: rather than introducing 
new meta-level links at will (often with no intuition about them), it 
reduces links to a minimum.

I have no comments at the moment on Guglielmi's proposal of "deep 
inference", except for:

At 4:23 +0200 15-06-2003, Alessio Guglielmi wrote:
>The biggest advantage that deep inference brings into the picture is 
>top-down symmetry for derivations.

In Basic Logic, all inference rules come in symmetric pairs: for 
example, &-formation is perfectly symmetric to v-formation.
This has been exploited to obtain a neat cut-elimination procedure 
for orthologic (one of the quantum logics); see C. Faggian, G. Sambin, 
From Basic Logic to quantum logic with cut-elimination, International 
J. of Theoretical Physics 37 (1998), pp. 31-37.

At 15:47 -0400 15-06-2003, Harvey Friedman wrote:
>In particular, my knowledge of linear logic is restricted to the following:
>
>1. It is at least originally motivated by trying to model the 
>tracking of resources [...]
>
>2. Nobody is claiming to have any completeness theorem associated 
>with 1, that goes to the heart of the purpose of linear logic.
>
>So I, for one, paid no attention to linear logic, assuming that item 
>2 is the principal open question. I was waiting to see some 
>particularly coherent and attractive expository or otherwise article 
>about 1,2 so that I would have a crack at getting to the bottom of 
>what an appropriate completeness theorem would look like.
>
>I have been seriously involved in completeness proofs of 
>intuitionistic logic over the years, and for various reaons, 
>intuitionistic logic has, from many points of view, proved its 
>permanent value.

Long before Basic Logic, I gave a uniform proof of completeness for 
linear logic (with no exponentials!), for intuitionistic linear logic 
and for intuitionistic logic (besides classical logic, of course) 
which could be of some help here. It uses the notion of pretopology, 
which can be given some heuristic meaning. Technically, the proof is 
quite simple and apparently the first to use only predicative 
arguments. See G. Sambin, Pretopologies and completeness proofs, J. 
Symbolic Logic 60(1995), pp. 861-878.
Ever since writing that paper I have thought it would be interesting, 
and nice, to find the connection with Friedman's proof for 
intuitionistic logic.
It goes without saying that I agree with him on the permanent value 
of intuitionistic logic.

At 15:47 -0400 15-06-2003, Harvey Friedman wrote:
>The standards for introducing a new logic and having it attain 
>permanent value are quite high.

I agree wholeheartedly.
In fact, I decided to write about Basic Logic only after realizing 
how much it helped me (and the people to whom I could explain it long 
enough) to understand better what a logic is in general. And of 
course we proved for it at least a solid cut-elimination procedure, 
with subformula property etc.

At 15:47 -0400 15-06-2003, Harvey Friedman wrote:
>In fact, we need a major new thread in FOM called something like 
>"the essences of nonclassical logics". Here subscribers to FOM could 
>take the point of view that all nonclassical logics are guilty 
>before they are proved innocent.

I do not agree here. I believe that all logics are equally guilty 
before one justifies them, including classical logic. I see a logic 
just as a stock of inference rules which preserve information given 
by judgements of a certain kind. For classical logic the judgement 
is: A is true, and the information is that A is true (in some kind of 
ultra-subjective sense); in intuitionistic logic the judgement is the 
same, but the information is that one can prove A to be true; in 
linear logic the judgement is, in my interpretation, that A is 
available as a resource, and the information is evidence for this. 
Unfortunately, I still have to understand better what the 
interpretation for basic logic is. But its role has been to pull out 
a general structure, or abstract understanding.

For the FOM reader interested also in my speculative views on the 
foundations of logic and of mathematics in general (with few 
formulae), I suggest reading G. Sambin, Steps towards a dynamic 
constructivism, in: In the scope of Logic, Methodology and Philosophy 
of Science, vol. 1, P. Gaerdenfors, K. Kijania-Placek and J. Wolenski 
(eds.), Kluwer 2002, pp. 263-286 (this is volume 1 of 2  with the 
Proceedings of the XI LMPS congress, Krakow 1999) and the other 
papers cited therein (unfortunately, still in Italian).

At 15:47 -0400 15-06-2003, Harvey Friedman wrote:
>Now in order to make this thread on the FOM effective, we need
>
>[...]
>b. Gugliemi to be prepared for the possibility that FOM subscribers 
>may be unconvinced of the value of linear logic, even to the point 
>of coming to the conclusion that it is of no value.

I am ready to answer even the most negative criticisms, if expressed 
in a civilized form, and insofar as my time allows (I am now working 
on the translations into basic logic).

At 14:22 +0200 16-06-2003, Alessio Guglielmi wrote:
>The fact that linear logic deals with `resources' is totally 
>irrelevant in my opinion.

I do not agree, perhaps even against Girard's opinion. But it seems 
that Guglielmi himself doesn't agree with himself, given his next 
explanation in his section "Linear logic and computer science".

At 14:22 +0200 16-06-2003, Alessio Guglielmi wrote:
>1   There are complete semantics for linear logic and fragments of 
>it, including model theoretic semantics. One such semantics is 
>`phase semantics', which you can find in [1, 2].

In the paper mentioned above ("Pretopologies...") I show that phase 
spaces are *exactly* the same as pretopologies in which the law of 
double negation is valid.

At 14:22 +0200 16-06-2003, Alessio Guglielmi wrote:
>3   I'd expect FOM subscribers to consider linear logic an important 
>mathematical idea, not necessarily a foundational one, and not to 
>expect semantics for it in the same spirit of semantics for 
>classical logic, because there isn't such a need.

I agree. For Basic Logic there is no need of a mathematical 
semantics, since the explanation given for inference rules is -in my 
opinion- fully satisfactory, also from a philosophical point of view. 
Still, it seems to be possible to give a simple complete mathematical 
semantics for basic logic; work on this is in progress.

Lastly, a small remark on the topic of Girard's ludics, with which 
however I must confess I have essentially no experience. We should be 
grateful to Andre Scedrov for his short, lucid explanation (as is his 
standard).

At 23:56 -0400 16-06-2003, Andre Scedrov wrote:
>Ludics is aiming to answer a deeper question: what are the general invariance
>principles from which logical rules come about? Imagine having a conversation
>with Gentzen, or perhaps even Aristotle, asking them just why and how did they
>arrive at such and such a formulation of logical rules. Was there a method?
>Are there structural restrictions that one must observe? Was Gentzen somehow
>forced to give his particular answer? That is, is there the logic of rules?

As should be clear from above, I propose the approach of Basic Logic 
as useful in this respect. It clearly could be understood by Gentzen, 
if not also by Aristotle; actually, I wonder why Gentzen did not 
speak about it! (One reason could be that he would first have had to 
discover linear logic.) With a warning: basic logic goes in a 
direction opposite to that of Girard. He looks for mathematical 
objects, I look for (philosophical?) explanations. According to him, 
this is still part of a "jurassic era". Since we remain good friends 
despite our scientific disagreement, I hope that one day a unitary 
theory can be found, giving both mathematical objects and convincing 
general explanations. In technical terms, perhaps the 
mathematical invariants underlying basic logic could be close to the 
objects of ludics. Finding this would be in my opinion a major 
advancement in the understanding of proofs, and of logic in general.

I apologize for my stiff (that is, not colloquial enough) English.

Giovanni Sambin

**************************************
Giovanni Sambin
Professor of Mathematical Logic
Dept. of Mathematics, Univ. of Padua (Italy)
interests related to FOM: weak logics, predicative mathematics 
(formal topology in particular), philosophy of mathematics

