FOM: Do we need Hume's principle at all?

Dean Buckner Dean.Buckner at
Sun Mar 17 11:52:12 EST 2002

Why do we need Hume's principle?  Is it simply to justify the idea that an
equation such as

    5 + 7 = 12

expresses an identity, on the lines of "Clarendon = Hyde", so that "5+7" and
"12" are a sort of proper name?

This leads to all sorts of bad things.  For example

(1) That to every adjective there corresponds a proper name, and hence some
sort of universal thing.  E.g. we can write "Roses have redness" for "Roses
are red".

(2) That a number word is an adjective, as in "four men", "four musketeers",
so "four" must also correspond to some abstract entity or universal.

(3)  But what does the adjective qualify?  "Four" does not qualify
"musketeers" as "brave" does.

(4) So it must qualify the set of musketeers, the set being something
different from the four things considered as a collection.

(5) All "sets" that number four must therefore have some common property, in
the way that all roses have redness.  What could this be?

(6) Easy: define each number as equinumerosity with a certain benchmark set
which we can construct in a certain way.  We can begin with the empty set,
the set of objects not identical with themselves.  "Zero" thus designates
the property common to all sets equinumerous with the empty set, such as the
set of mermaids, of dodos, of London plumbers, &c.  To define equinumerosity,
of course, we need Hume's Principle [*].

(7)  So "four" is a concealed proper name of some sort.  Obvious, wasn't it?
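Step (6)'s notion of equinumerosity — a one-to-one correspondence that pairs each member of one collection with exactly one member of the other, leaving nothing over — can be sketched for the finite case in a few lines of Python (a minimal illustration of the idea, not anything in the original posting; the function names are mine):

```python
from itertools import permutations

def correspondences(a, b):
    """Yield every one-to-one pairing that exhausts both a and b
    (finite case only)."""
    a, b = list(a), list(b)
    if len(a) != len(b):
        return  # no pairing can exhaust both collections
    for ordering in permutations(b):
        yield list(zip(a, ordering))

def equinumerous(a, b):
    """Finite-case sketch of Hume's Principle: a and b are
    equinumerous iff at least one such correspondence exists."""
    return next(iter(correspondences(a, b)), None) is not None
```

On this sketch the empty correspondence makes any two empty collections equinumerous, which is exactly the role the empty set plays as the benchmark for "zero" in step (6).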

This piles bad things on bad things: the idea of universals, the idea that
from any adjective we can form a nominal expression which must then name
something, the idea of sets, the idea of the empty set.

As I've argued in earlier postings, why should we replace the simple and
elegant "there are no unicorns" with "the number of the set of unicorns is
the same as the number of the set of objects that are non-self-identical"?
Why do mathematicians suppose this complicated substitution truly reflects
the form of "there are no unicorns", rather than being just a more
complicated way of saying something inherently simple?  Similarly why not

    There is at least one unicorn = not (there are no unicorns)
    There is exactly one unicorn = There is at least one unicorn and no other
    There are at least two unicorns = There is one unicorn and at least one other

and so on.  We don't need abstract objects (or "objects") of any sort, nor
sets, nor existence of anything to explain arithmetic.  Nor "Hume's
Principle" indeed.

We only require the logic of simple quantification, and the use of "other".
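That claim can be made concrete over a finite domain: the three definitions above use nothing but quantification over the domain and distinctness ("other") — no numbers, sets, or abstract objects appear anywhere.  A sketch in Python (my own illustration; the names are hypothetical):

```python
def at_least_one(domain, is_unicorn):
    """'There is at least one unicorn' = not (there are no unicorns)."""
    return any(is_unicorn(x) for x in domain)

def exactly_one(domain, is_unicorn):
    """At least one unicorn, and no *other* unicorn."""
    return any(is_unicorn(x)
               and not any(is_unicorn(y) and y != x for y in domain)
               for x in domain)

def at_least_two(domain, is_unicorn):
    """One unicorn, and at least one *other* unicorn."""
    return any(is_unicorn(x)
               and any(is_unicorn(y) and y != x for y in domain)
               for x in domain)
```

The only apparatus used is quantification (`any`) and the distinctness test `y != x` — the "other" of the definitions above.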

[*]  On whether "Hume's principle" is the right name for this:  Hume writes
(Treatise 1.3.1 "Of knowledge") -   "When two numbers are so combined, as
that the one always has an unit answering to every unit of the other, we
pronounce them equal".   A "unit" he defines in Treatise 1.2.2 ("On the
infinite divisibility of space and time") as the single things on whose
existence the existence of any number of things depends.

Dean Buckner
4 Spencer Walk
London, SW15 1PL

Work 020 7676 1750
Home 020 8788 4273
