FOM: Altavista and Chu spaces

Vaughan Pratt pratt at cs.Stanford.EDU
Mon Mar 16 14:46:43 EST 1998


... but if the number of hits on Altavista matters that much to Franzen
then he should be impressed by Chu spaces, which got 81 hits to subtle
cardinals' 5 hits.  (Among these, or rather among the additional 20
hits on "chu space" singular, was a page by game author James
Chu, Space Hulk's developer, but the rest seemed on target.)

A quick plug for Chu spaces: they are an alternative way of representing
relational information.  Instead of listing the ways things associate,
such as (2,3,5) as a tuple of a ternary relation meaning 2+3=5,
one lists the ways things can dissociate.  This is how topological
spaces are organized, but it also turns out to be the principle behind
the definition of Hilbert space as an inner product space, and the
representation of Abelian groups in terms of their group characters,
to just scratch the surface.
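
To make the dissociative view concrete, here is a tiny illustration (in
Python, with nothing canonical about the names): the Sierpinski space
written out as a table recording, for each point and each open set,
whether the point lies in it.

    # Illustrative only: the Sierpinski space, points {bot, top}, open sets
    # {}, {top} and {bot, top}, presented by dissociation: a {0,1}-valued
    # table saying which points each open set contains.
    points = ["bot", "top"]
    opens = ["empty", "just_top", "whole"]
    contains = {
        ("bot", "empty"): 0,    ("top", "empty"): 0,
        ("bot", "just_top"): 0, ("top", "just_top"): 1,
        ("bot", "whole"): 1,    ("top", "whole"): 1,
    }
    # Read column by column: the open sets become the length-2 bit strings
    # 00, 01 and 11 over the two points.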

Chu spaces are universal: dissociation can encode association but not
conversely, and beyond mere association Chu spaces can concretely represent
the transformational behavior of any object of any small concrete category.
This is well beyond the means of any other extant notion of concrete
structure, including structures that combine topology with relations of
any arity, even infinitary.

My favorite way today of explaining how Chu spaces work is in terms of
crosswords.  A Chu space is a dictionary all of whose words have the same
length.  Two spaces interact (breed if you will) by one contributing the
horizontal words of a crossword puzzle solution, the other the vertical.
(There is an evident constraint on this breeding process: every letter
in the solution has to appear in two words, one from each dictionary,
in the appropriate positions.)  The resulting Chu space, called the
*tensor product* A@B of the two dictionaries, has for its words the
solutions to this puzzle, whose common length is the area of the puzzle
(i.e. discard the rectangular structure).  It is NP-complete to decide if
the tensor product is consistent (contains any words), by NP-completeness
of crosswords, [Garey and Johnson, look for crosswords in the index].
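
For the concretely minded, here is a rough Python sketch of the breeding
step; the representation of a dictionary as a set of equal-length tuples
and the helper names are mine, purely for illustration.

    from itertools import product

    def tensor_words(dict_A, dict_B):
        # Brute-force the crossword solutions: fill a grid so that every row
        # is a (horizontal) word of dict_A and every column a (vertical) word
        # of dict_B.  Each solution, flattened, is one word of A@B.
        n_A = len(next(iter(dict_A)))   # letters per word of A = grid width
        n_B = len(next(iter(dict_B)))   # letters per word of B = grid height
        solutions = []
        for rows in product(dict_A, repeat=n_B):      # one word of A per row
            cols = tuple(zip(*rows))                  # read the grid off by columns
            if all(col in dict_B for col in cols):    # every column must lie in B
                solutions.append(sum(rows, ()))       # flatten: length n_A * n_B
        return solutions

    # Two tiny dictionaries over the alphabet {0, 1}
    A = {(0, 0), (0, 1), (1, 1)}
    B = {(0, 0), (1, 1)}
    print(tensor_words(A, B))   # the words of A@B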

A continuous function from Chu space A to Chu space B is a solution to the
puzzle whose vertical dictionary is A and whose horizontal dictionary is
the transpose B' of B (interchange the roles of word and letter position,
i.e. a word of B' collects the letters from one fixed position across the
words of B).  By taking the word positions as the carrier, and assuming
that B', understood as a dictionary, has no repeated words, it
can be seen that a solution uniquely specifies a function from the word
positions of A to the word positions of B (i.e. to the words of B'):
just read off the words as they appear in the solution!

This is exactly the underlying principle of continuity for topological
spaces, the alphabet being {0,1}, word positions points, and words
open sets.
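
In the same illustrative Python as above, the condition is easy to
brute-force: a map of word positions is continuous just when every word
of B, pulled back through the map, is a word of A.

    from itertools import product

    def continuous_maps(dict_A, dict_B):
        # All maps f from letter positions of A to letter positions of B such
        # that each word w of dict_B, pulled back through f as the tuple
        # (w[f[0]], w[f[1]], ...), lands in dict_A.  Over the alphabet {0,1},
        # with words read as open sets, this says: preimages of opens are open.
        n_A = len(next(iter(dict_A)))
        n_B = len(next(iter(dict_B)))
        maps = []
        for f in product(range(n_B), repeat=n_A):     # f[a] = image of position a
            if all(tuple(w[f[a]] for a in range(n_A)) in dict_A for w in dict_B):
                maps.append(f)
        return maps

    # The Sierpinski space from the first illustration, open sets as words:
    # position 0 is the point bot, position 1 the point top.
    S = {(0, 0), (0, 1), (1, 1)}
    print(continuous_maps(S, S))   # three maps: both constants and the identity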

If we write A -o B for the Chu space whose word positions are the continuous
functions from A to B and whose words are (indexed by) the squares of
the puzzle, then the above can be summarized by the formula

	A -o B   =  ( A @ B')'

If one reads @ as conjunction, -o as implication, and ' as complement,
then this is recognizable as a logical law.  The other laws that hold
for this setup are exactly the laws of linear logic.
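
Reusing continuous_maps from the sketch above (still purely illustrative),
the formula can be read off directly: the word positions of A -o B are the
continuous maps, and each square of the puzzle, namely a letter position of
A paired with a word of B, indexes a word recording what letter each
continuous map's solution puts in that square.

    def hom_words(dict_A, dict_B):
        # A -o B = (A @ B')': its word positions are the continuous maps found
        # by continuous_maps() above, and its words are indexed by the squares
        # of the puzzle, i.e. by pairs (letter position a of A, word w of B).
        # The word at square (a, w) gives each continuous map f the letter
        # w[f[a]], the letter f's solution puts in that square.
        fs = continuous_maps(dict_A, dict_B)
        n_A = len(next(iter(dict_A)))
        return {(a, w): tuple(w[f[a]] for f in fs)
                for a in range(n_A) for w in dict_B}

    S = {(0, 0), (0, 1), (1, 1)}
    print(hom_words(S, S))   # six words, one per square, each of length three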

The notion of word used here is not completely standard with respect to
the order of letters within a word, a point which may clarify the obvious
difficulty that arises with the notion of transposition of a dictionary:
in what order must the words of the dictionary be listed, so as to
determine the order of letters in the transposed dictionary?  The answer
is that there is no globally specified order of either letters or words.
On the other hand the position of letters within words is locally
important, in the sense that when comparing two words of the same
dictionary one should compare corresponding letter positions.

This seemingly tricky notion is formalized very simply by defining a
dictionary, i.e. a Chu space, to be two sets A and B indexing respectively
the word positions and the words, together with a function r: AxB -> K
where K is the alphabet.  Each word index b indexes the word whose a-th
letter is r(a,b).
It turns out that crosswords are still NP-hard to solve when word order
is only local in this sense and not global (this is obvious when you
think about it, even without knowing how the proof goes), and in fact
the proof needs no modification.
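
In the same illustrative Python, this formalization is equally short, and
transposition really is just a matter of swapping the two index sets
(helper names mine):

    def word(space, b):
        # A Chu space as a triple (A, B, r): word b's a-th letter is r(a, b).
        A, B, r = space
        return tuple(r[(a, b)] for a in A)

    def transpose(space):
        # Interchange word positions and words.  No global order of either is
        # needed, since r is indexed by (position, word) pairs, not by lists.
        A, B, r = space
        return (B, A, {(b, a): k for (a, b), k in r.items()})

    # Sierpinski once more, now in the (A, B, r) form
    S = (["bot", "top"],
         ["empty", "just_top", "whole"],
         {("bot", "empty"): 0,    ("top", "empty"): 0,
          ("bot", "just_top"): 0, ("top", "just_top"): 1,
          ("bot", "whole"): 1,    ("top", "whole"): 1})
    print(word(S, "just_top"))          # (0, 1)
    print(word(transpose(S), "top"))    # (0, 1, 1): top's letter in each open set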

See http://boole.stanford.edu/parikh for a recent introductory
paper of mine on Chu spaces (in html, not postscript).  See
http://boole.stanford.edu/chuguide.html for a list of twenty-odd
papers by my colleagues and me on them, with abstracts and links to
gzipped postscript.  For other papers see Altavista with "chu spaces"
(in quotes as shown) as the simple-search entry---the first paper that
will pop up is by Francois Lamarche in France, a very nice one on a
useful kind of Chu space that generalizes directed CPO's while still
forming a cartesian closed category.

Whether any of the above has anything to do with foundations seems to me
highly dependent on whose definition of foundations one uses.  I get
pretty tired of hearing that there's only one true definition of
foundations.

Vaughan Pratt


