FOM: Re: Berkeley and nonstandard analysis
charles silver
silver_1 at mindspring.com
Fri Jan 28 11:57:01 EST 2000
>At 11:25 AM 1/27/00 -0800, Charles Silver wrote:
>> Doesn't non-standard analysis show that Berkeley was wrong and
>>Leibniz was right?
Martin Davis:
>No. Berkeley's criticism was perfectly correct for the subject as it stood.
>He pointed out, for example, that the usual way to calculate derivatives
>first assumes a non-zero increment and then after a division by that
>increment, gets the result by setting that increment to 0. So two
>assumptions in direct contradiction were being used.
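To make the contradiction Davis describes concrete, here is the classical computation for f(x) = x^2, written out as an illustration (not taken from the original post):

```latex
\frac{(x+\Delta x)^2 - x^2}{\Delta x}
  = \frac{2x\,\Delta x + (\Delta x)^2}{\Delta x}
  = 2x + \Delta x
```

The division is legitimate only if \Delta x \neq 0, yet the final step obtains the derivative 2x by setting \Delta x = 0 -- the two contradictory assumptions Berkeley pointed out.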
Let me try this out: Is it correct to say that Leibniz's infinitesimals
should *not* have been rejected on the basis of Berkeley's complaint that
they were "the ghosts of departed quantities," but that Berkeley's criticism
that the derivative was inconsistently determined was just? That is,
Leibniz's *concept* of the infinitesimal was sound, wasn't it, even though
he made errors of application? Or would you describe it another way? What
I'm thinking of is Leibniz's concept of the infinitesimal as a number
smaller than any standard number, such that.... It seems to me Robinson's
work vindicates this *concept*. Don't you think?
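As a sketch of how Robinson's framework avoids the contradiction (using st for the standard-part map, which rounds a finite hyperreal to its nearest standard real; this illustration is mine, not from the post): for a nonzero infinitesimal \varepsilon,

```latex
\frac{(x+\varepsilon)^2 - x^2}{\varepsilon} = 2x + \varepsilon,
\qquad
f'(x) = \operatorname{st}(2x + \varepsilon) = 2x
```

Here \varepsilon is never set to 0; it is simply discarded by the standard-part map, so the division and the final answer rest on consistent assumptions.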
I have another question, which relates to what Mark Steiner said about
mathematicians not rushing around to get introductory calculus textbooks
that use infinitesimals. Isn't it true that it is *easier* and *clearer*
to teach calculus using infinitesimals than the standard epsilon-delta
way, assuming that some theory is to be incorporated (which, I realize,
many textbooks have now abandoned)? That is, aren't the main reasons for
the abandonment of infinitesimals in the teaching of beginning calculus
historical rather than mathematical?
Charlie