[FOM] expressive power of natural languages

Patrik Eklund peklund at cs.umu.se
Fri Dec 2 03:40:34 EST 2011


Is it really a question about the expressive power of a (natural) language, 
or about the expressive power of individuals using a language of their choice?

---

I believe (at least) in the following two things:

I. Every individual uses a language of their choice. If the language is 
logical, individuals can provide reasoning, and potentially also 
dialectics. If not, no reasoning is possible, and everything is basically 
individual rhetoric (and all operators used in the underlying signature 
are basically of zero arity).

To be a bit more formal, suppose these individually adopted languages are 
logical, and two individuals communicate, each using their own individual 
logic. One individual arrives at or derives a sentence and delivers it as 
true (in the sense of the semantics understood or selected by that 
individual for her/his logic) to the other individual. There must now be 
some kind of transformation between these two logics, so that the sentence 
is transformed from one to the other. The transformed sentence received 
by the other individual is up for evaluation in the theory embraced by 
that individual. It may happen that the transformed sentence is not even 
valid as a sentence in the other logic, so the other individual would 
reply "I don't even understand what you are trying to say". If the 
transformed sentence is indeed valid as a sentence, but not necessarily 
true in the other logic as it is, untransformed, in the first logic, a 
dialectic may be initiated, in which sentences, transformed back and 
forth, are delivered within some communication pattern.
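This back-and-forth can be sketched in a few lines of code. Everything 
below is a toy illustration with hypothetical names: each individual owns 
a signature (predicate symbols with arities) and a theory (the sentences 
she holds true), and a transformation renames symbols from one signature 
into the other.

```python
class Individual:
    """An individual who owns a logic: a signature plus a theory."""

    def __init__(self, signature, theory):
        self.signature = signature  # predicate symbol -> arity
        self.theory = theory        # set of sentences held true

    def receive(self, sentence):
        """Evaluate an incoming sentence against this individual's logic."""
        symbol, args = sentence[0], sentence[1:]
        if self.signature.get(symbol) != len(args):
            return "I don't even understand what you are trying to say"
        if sentence in self.theory:
            return "agreed"
        return "valid sentence, but not true for me: a dialectic may begin"

def transform(sentence, symbol_map):
    """Translate a sentence between logics by renaming its predicate symbol."""
    symbol, args = sentence[0], sentence[1:]
    return (symbol_map.get(symbol, symbol),) + args

# Two individuals with different signatures, and a transformation between them.
alice = Individual({"Wise": 1}, {("Wise", "socrates")})
bob = Individual({"Sage": 1}, set())

sent = ("Wise", "socrates")
print(bob.receive(sent))                               # not even a sentence for Bob
print(bob.receive(transform(sent, {"Wise": "Sage"})))  # valid, but up for debate
```

Who owns `symbol_map` is exactly the open question raised below: the 
sketch leaves it as a free-standing function belonging to neither 
individual.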

The interesting thing here is the question of who owns the 
transformation. Each logic is clearly owned by its individual, but the 
transformations used are subject to dynamics, as are the respective 
logics. Nothing basically says anybody has to stay within a logic adopted 
at some point in the past. Quite the contrary: we usually do whatever it 
takes to maintain that our views are the valid and true ones.

Social choice, for instance, can be extended within these views.

II. Now one may claim everything is basically some-order from the start, 
so that whatever we do can be boiled down to some-order expressions. 
Hilbert tried it, and some (!) still believe in his thesis. I neither 
believe nor disbelieve it. I believe in metalanguage and object language, 
and indeed in the respectful relation between the two, and that we should 
not be allowed to switch back and forth between them (some call it being 
self-referential) or even bypass the distinction as we please. There is a 
some-order logic for mathematics, and there is a set theory for 
mathematicians. From a formal-theory point of view, they developed hand 
in hand, as Hilbert also said.

Now, on this apparatus we can place e.g. category theory as a language, 
making it a metalanguage for logic as an object language. My favourite is 
then the Goguen and Meseguer approach to institutions and entailment 
systems, a formalism that, by the way, can be further generalized.

Developing signatures and terms, and then closing that shop; then 
developing sentences based on terms, and closing that shop; and so on, 
step by step defining your logic, places some interesting restrictions 
e.g. on sentence creation, restrictions that e.g. Gödel wouldn't have 
liked at all.
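The "close the shop" layering can be made concrete in a small sketch (all 
names hypothetical): the term layer is built over a fixed signature and 
finished first, and the sentence layer is then built only over finished 
terms, so a sentence can never occur in a term position.

```python
# Layer 1: a signature -- operation symbols with arities. Close this shop first.
SIGNATURE = {"zero": 0, "succ": 1}

def is_term(t):
    """A term is a well-aritied application of signature symbols to terms."""
    op, args = t[0], t[1:]
    return SIGNATURE.get(op) == len(args) and all(is_term(a) for a in args)

# Layer 2, built only after layer 1 is closed: predicates relating terms.
PREDICATES = {"eq": 2}

def is_sentence(s):
    """A sentence applies a predicate to finished terms, never to sentences."""
    pred, args = s[0], s[1:]
    return PREDICATES.get(pred) == len(args) and all(is_term(a) for a in args)

two = ("succ", ("succ", ("zero",)))
assert is_term(two)
assert is_sentence(("eq", two, two))
# The stratification blocks self-reference: a sentence is not a term,
# so it cannot be fed back into a term position.
assert not is_sentence(("eq", ("eq", two, two), two))
```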

Not encapsulating "first-order logic for mathematics and set theory" into 
what it is really intended for, but instead allowing it to embrace 
everything and enable self-referentiality, is a huge mistake of mankind.

---

On Chomsky I would say it's not logical. It's language in the logically 
restricted sense of involving regular expressions: basically there are 
terms, and some may say there are sentences, but there are no inference 
calculi in these views of "language".

Connecting "natural language" to 'language' only in the sense of being 
part of "automata and languages" is historically a huge mistake in 
computer science.
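A toy recognizer (not from the original post) makes the point concrete: a 
regular language comes with a yes/no membership test and nothing else, so 
there are "sentences" in the sense of accepted strings, but no inference 
relation between them.

```python
# A DFA for strings over {a, b} that end in "ab". Transition table:
# state 0 = start, state 1 = just saw 'a', state 2 = just saw "ab" (accepting).
DELTA = {
    (0, "a"): 1, (0, "b"): 0,
    (1, "a"): 1, (1, "b"): 2,
    (2, "a"): 1, (2, "b"): 0,
}
ACCEPTING = {2}

def accepts(word):
    """Run the DFA; membership is all this notion of 'language' can express."""
    state = 0
    for ch in word:
        state = DELTA[(state, ch)]
    return state in ACCEPTING

assert accepts("ab")
assert accepts("aab")
assert not accepts("aba")
# Note there is no entailment here: accepts("ab") and accepts("aab") are
# independent facts, with no calculus deriving one from the other.
```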

---

Best regards,

Patrik



On Thu, 1 Dec 2011, John Kadvany wrote:

> Though I'm not a linguist, here's what I've gleaned from some of the
> literature on mathematical computation and natural language:
>
> - In the '50s, Chomsky noted already that natural language grammars would not
> need the full power of generic Turing machines. So one framing of the issue is
> in terms of computational power rather than set-theoretic power, as suggested
> in the earlier FOM email.
>
> - When Chomsky introduced transformations, that was shown by Rice and others
> to be potentially the power of a universal Turing machine, hence too powerful
> for linguistic models. There may have been miscommunication here between
> linguists and mathematicians in terms of what counts as a natural language
> grammar and the role of transformations.
>
> - Generally, the complexity of natural languages derives from linguistic
> intricacies (anaphor, 'long-distance' reference, 'movement', etc.), not
> computational complexity. Pullum (of 'Eskimo snow' fame), Cullicover and
> others have suggested that there isn't much need for complex logical machinery
> to account for linguistic generativity. Context-sensitive rules (already
> understood and formalized by ancient Indian linguists, cf. Staal) look to be
> about the maximum complexity level needed. Culicover argued in a review that
> Pullum and Huddleston's Cambridge English Grammar (~1700pp, 2002) could
> effectively be representative of sufficient implicit knowledge of the
> language. I take this to mean that this large compendium of generic
> constructions simply get 'cut and pasted' ('Merge' may be the term of art) to
> yield all possible sentences. So the process is ultimately recursive and
> decidable, but of such messy complexity that a single 'master formula' for
> grammaticality is virtually impossible. My sense is also that Jackendoff, long
> of the generative school, also sees much of linguistic complexity as being
> handled by human cognition rather than being coded up via some mathematical
> representation.
>
> - As an observation, many languages can be used to directly formulate
> additive number words, akin to Roman numerals with a highest unit (e.g. '50s'
> or '100s'). Positional multiplicative value was a big discovery, not obvious,
> and perhaps requiring inscription rather than speech alone. So that's more
> evidence that at least the 'natural' level of natural language complexity is
> roughly that of additive arithmetic, and therefore falling short of general
> computation including a single unbounded multiplication operation.
>
> - The linguist Dan Everett argued a couple years back that the Piraha
> (Amazon) language is not even quite recursive. This discussion received a lot
> of attention and there is literature on both sides of the debate.
>
> John Kadvany
>
> -----Original Message-----
> From: fom-bounces at cs.nyu.edu [mailto:fom-bounces at cs.nyu.edu] On Behalf Of
> americanmcgeesfr at gmx.net
> Sent: Wednesday, November 30, 2011 11:37 AM
> To: fom at cs.nyu.edu
> Subject: [FOM] expressive power of natural languages
>
> Hello FOMers,
>
> I was wondering if there is any (at least semi-)conclusive view about the
> expressive power of a natural language like english resulting in a statement
> like "whatever it is, it is a language of at least 2nd order".
> Of course, I know of Tarski´s comment suspecting natural languages to be
> somehow (semantically) universal. But what I´m interested in is a hint
> pointing me in a direction what to look for, i.e. is the fact that one
> quantifies over classes in a natural language enough to label it higher order?
> Can there be anything wrong to take it to be at least a many-sorted
> first-order language?
>
> Thanks
> Alex Nowak
> _______________________________________________
> FOM mailing list
> FOM at cs.nyu.edu
> http://www.cs.nyu.edu/mailman/listinfo/fom
>

