FOM: Length of formalizations

Harvey Friedman friedman at
Thu Jan 13 21:36:41 EST 2000

Reply to Rudnicki Mon, 3 Jan 2000.

>There is an ongoing effort in Mizar to formalize "A Compendium of
>Continuous Lattices" by G. Gierz et al., Springer 1980.  In 1997,
>together with Cz. Bylinski, we formalized pages 105-108 of the book
>about the Scott topology (the formalization of the earlier material
>was available to us).  The text in the book includes definitions,
>theorems with proofs and comments; it uses about 2000 tokens (words
>and symbols).  The Mizar formalization uses about 15500 tokens.  About
>4500 of these concern (simple) facts that in the book are not proved
>but used.  Thus the 2000 tokens from the book required about 11000
>tokens in the formalization.  (These numbers should be treated only as
>a rough measure; they depend on what is counted as a token.)  I am
>in no position to judge how deep and complicated the mathematics
>involved is, but the length of the formalization was much smaller than
>we initially expected.
>The Mizar formalization is available at:

This is very impressive news, just the kind of news I was looking for. Of
course, "A Compendium of Continuous Lattices" is unlikely to be like
Fermat's Last Theorem, but nevertheless this is very encouraging.

Of course, the next question is what happens with more far-ranging animals
like, say, Rudin's undergraduate real analysis text, or Landau's
Foundations of Analysis. Can you handle a text that is currently used in
the undergraduate mathematics curriculum? Are we going to get away with a
modest constant factor blowup?
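For concreteness, Rudnicki's figures above work out to a blowup factor of roughly 5.5. A minimal arithmetic sketch (variable names are mine, not from the original post):

```python
# Figures quoted from Rudnicki's message.
book_tokens = 2000          # tokens in the book's pages 105-108
mizar_tokens_total = 15500  # tokens in the full Mizar formalization
background_tokens = 4500    # tokens for simple facts the book uses but does not prove

# Tokens formalizing the book's own material.
new_material = mizar_tokens_total - background_tokens  # 11000

# Constant-factor blowup of formalization over informal text.
blowup = new_material / book_tokens
print(blowup)  # → 5.5
```

Whether a factor in this range persists for a full undergraduate text is exactly the open question posed here.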
