[FOM] identities
Insall, Matt
insall at umr.edu
Sat Jan 20 14:41:40 EST 2007
Martin,
Greetings from Missouri.
I do not have a specific reference for exactly that problem, but I have one reference (mentioned below) from which I may be able to glean the answer to this question. I suggest that your friend look at the universal algebra literature. Some authors who have probably investigated this and related questions are:
McKenzie
McNulty
Kearnes
Valeriote
Burris
Sankappanavar
Grätzer
& their co-authors
Specifically, I have seen talks presented by McKenzie and McNulty in which they discussed unsolvable problems in universal algebra. I know that Kearnes has written papers dealing with logic in universal algebra, that McKenzie & Valeriote wrote a text called ''The Structure of Decidable Locally Finite Varieties'', and that Burris & Sankappanavar wrote a very nice text on universal algebra that explains equational logic [which is the ''logic of identities''] nicely. Grätzer's text on the subject is also a classic tome.
The Burris & Sankappanavar text is available for download from Burris's web page. I have a copy of the McKenzie & Valeriote text here with me, and I will try to figure out whether the answer to your friend's question follows from what they present. (Note: in what follows, I sometimes paraphrase and sometimes quote exactly.)
On pages 8 & 9, they present the notion of a hereditarily undecidable theory: Let K be a class of models of a language L, and let T be its theory. Then K (or T) is decidable iff for each finite sublanguage S of L, the intersection of wff(S) with T is recursive. The class K (or theory T) is hereditarily undecidable iff each class (of models of L) of which K is a subclass is itself undecidable (i.e., iff each subtheory of T is undecidable).
Now, in Chapter 1, McKenzie & Valeriote explain that their text shows that ''...every locally finite variety that fails to be hereditarily undecidable decomposes as the product of three subvarieties which are, respectively, strongly Abelian, affine, and a discriminator variety.''
Thus, we have the following corollary of their main result:
Let X be a set of variables, and let L be an equational language whose variables are all in X. Let V be the variety (also called an equational class, by Birkhoff's Theorem) of all algebras which are models of L. If V has a locally finite subvariety K such that K does not decompose as the product of a strongly Abelian subvariety, an affine subvariety, and a discriminator subvariety, then V is undecidable, because K in this case is hereditarily undecidable.
Now, let me give you some thoughts on this without proofs [and hence, in some cases, maybe wrong - though I hope not :-}]:
1. If the set X is a singleton and L is finite, isn't V decidable? So in this trivial case, there should be a (trivial) algorithm for determining whether a given identity is implied by a given finite set of identities.
2. If X is infinite, then I suspect that for any nontrivial language L over X (i.e., a language that does not have only nullary operation symbols, and does not have only one function symbol which is unary), V has at least one locally finite subvariety that does not decompose as a product of a strongly Abelian subvariety, an affine subvariety, and a discriminator subvariety. In this case, V is undecidable, and then there is no algorithm for determining whether a given identity is implied by a given finite set of identities.
3. If X is finite with |X| > 1, then perhaps the answer depends even more severely on the number and arity of the operation symbols in the language.
4. Since I have not tried to prove 1, 2, or 3, it is possible that I have overstated the necessary ingredients for undecidability. Maybe even for a language L with only one operation symbol, which is unary, the variety V is undecidable, but my concern in this case is that in this type of language, what constitutes an expression is severely limited: only one variable can occur on either side of an identity, so only two variables can occur in any identity, and so in this case we may as well have only two variables. Doesn't this lead to decidability of the variety?
5. Probably you and your friend intended a subcase of case 2: X is denumerable and L is nontrivial. If so, I think the answer to the question is ''No, there is no such algorithm,'' and the universal algebra literature I am pointing you to is likely to contain a proof of this.
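To make point 4 above concrete: in a language with a single unary operation symbol f, every term in one variable is f^n(x), so an identity is just a pair of exponents, and the derivation rules (applying f to both sides, or substituting f^k(x) for x) only shift both exponents upward. The following Python sketch is my own illustration, not taken from any of the texts above; it tests derivability in this fragment by union-find on exponents, and the bound is an assumption of mine that must be taken generously, since a derivation may pass through exponents larger than those in the goal.

```python
def derivable(axioms, goal, bound=50):
    """Test whether the identity f^a(x) = f^b(x), given as the pair
    goal = (a, b), follows from axioms (a list of such pairs) in a
    language with one unary operation f.

    Applying f to both sides of an identity shifts both exponents by 1,
    and substituting f^k(x) for x shifts them by k, so derivability is
    membership in the equivalence relation on exponents generated by
    all upward shifts of the axioms.  We compute it with union-find,
    considering exponents only up to `bound`."""
    parent = list(range(bound + 1))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for m, n in axioms:
        # Identify f^(m+k)(x) with f^(n+k)(x) for every shift k in range.
        for k in range(bound - max(m, n) + 1):
            parent[find(m + k)] = find(n + k)

    a, b = goal
    return a <= bound and b <= bound and find(a) == find(b)
```

For example, from the single axiom f^2(x) = f^5(x), the identity f^4(x) = f^7(x) is derivable (shift by 2), and f^2(x) = f^8(x) follows by transitivity, but f^1(x) = f^4(x) is not derivable, since no rule lowers an exponent.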
If I come across a clearly stated theorem about this, together with a proof, I will let you know.
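In the meantime, one half of the problem is mechanical: to show that an identity is NOT implied, it suffices to exhibit a finite algebra satisfying the axioms but refuting the goal, and one can brute-force all algebras up to a given cardinality. (Birkhoff's completeness theorem makes the positive half semi-decidable by proof search; the point of the undecidability results above is that no algorithm combines the two halves in general.) Here is a sketch of the counterexample search in Python; the term encoding and all names are my own choices, purely for illustration.

```python
from itertools import product

def term_vars(term):
    """Variables of a term; terms are variable names (str) or tuples
    (op_name, arg_term, ...)."""
    if isinstance(term, str):
        return {term}
    vs = set()
    for arg in term[1:]:
        vs |= term_vars(arg)
    return vs

def eval_term(term, op_tables, env):
    """Evaluate a term in a finite algebra given by operation tables."""
    if isinstance(term, str):
        return env[term]
    op, *args = term
    return op_tables[op][tuple(eval_term(a, op_tables, env) for a in args)]

def holds(identity, op_tables, size, variables):
    """Does the identity hold under every assignment of the variables?"""
    lhs, rhs = identity
    return all(
        eval_term(lhs, op_tables, dict(zip(variables, vals)))
        == eval_term(rhs, op_tables, dict(zip(variables, vals)))
        for vals in product(range(size), repeat=len(variables)))

def find_counterexample(axioms, goal, arities, max_size=2):
    """Search every algebra of cardinality <= max_size over the signature
    given by `arities` (an op-name -> arity dict) for one satisfying all
    `axioms` but refuting `goal`.  Returns its operation tables, or None.
    Finding one proves the goal is not implied; finding none proves
    nothing, since a counterexample may need a larger (or infinite)
    algebra -- which is exactly why this is only a semi-procedure."""
    variables = sorted(set().union(
        *(term_vars(l) | term_vars(r) for l, r in axioms + [goal])))
    ops = sorted(arities)
    for size in range(1, max_size + 1):
        domains = [list(product(range(size), repeat=arities[op])) for op in ops]
        # Each candidate algebra is one choice of output value for every
        # input tuple of every operation symbol.
        for outputs in product(*(product(range(size), repeat=len(d))
                                 for d in domains)):
            op_tables = {op: dict(zip(dom, out))
                         for op, dom, out in zip(ops, domains, outputs)}
            if (all(holds(ax, op_tables, size, variables) for ax in axioms)
                    and not holds(goal, op_tables, size, variables)):
                return op_tables
    return None
```

For instance, find_counterexample([], (('g','x','y'), ('g','y','x')), {'g': 2}) returns a non-commutative two-element groupoid, witnessing that commutativity is not a consequence of the empty set of identities, while the search for a counterexample to f(x) = x implying f(f(x)) = x comes up empty.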
Matt
Dr. Matt Insall
Associate Professor of Mathematics
Department of Mathematics and Statistics
University of Missouri - Rolla
Rolla MO 65409-0020
(573)341-4901
insall at umr.edu
-----Original Message-----
An "identity" is a sentence that can be expressed as the
universal quantification of an equation. A friend has
asked whether there is an algorithm for determining
whether a given identity is implied by a given finite set
of identities. Surely this is known to be unsolvable.
References?
Martin