[FOM] expressive power of natural languages

John Kadvany jkadvany at sbcglobal.net
Thu Dec 1 11:32:14 EST 2011

Though I'm not a linguist, here's what I've gleaned from some of the
literature on mathematical computation and natural language:
 - In the '50s, Chomsky already noted that natural language grammars would not
need the full power of general Turing machines. So one framing of the issue is
in terms of computational power rather than set-theoretic power, as suggested
in the earlier FOM email. 

 - When Chomsky introduced transformations, transformational grammars were
shown by Rice and others to have potentially the power of a universal Turing
machine, hence to be too powerful as linguistic models. There may have been
miscommunication here between linguists and mathematicians over what counts as
a natural language grammar and over the role of transformations. 

 - Generally, the complexity of natural languages derives from linguistic
intricacies (anaphora, 'long-distance' reference, 'movement', etc.), not
computational complexity. Pullum (of 'Eskimo snow' fame), Culicover and
others have suggested that there isn't much need for complex logical machinery
to account for linguistic generativity. Context-sensitive rules (already
understood and formalized by ancient Indian linguists, cf. Staal) look to be
about the maximum complexity level needed. Culicover argued in a review that
Huddleston and Pullum's Cambridge Grammar of the English Language (~1700pp,
2002) could effectively be representative of sufficient implicit knowledge of
the language. I take this to mean that the generic constructions in this large
compendium simply get 'cut and pasted' ('Merge' may be the term of art) to
yield all possible sentences. So the process is ultimately recursive and
decidable, but of such messy complexity that a single 'master formula' for
grammaticality is virtually impossible. My sense is also that Jackendoff, long
of the generative school, sees much of linguistic complexity as being handled
by human cognition rather than being coded up in some mathematical formalism.
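
To make the 'cut and paste' picture concrete, here is a toy sketch of my own
(not drawn from Culicover, Pullum or Jackendoff; the grammar and words are
invented for illustration): a handful of stored construction templates are
recursively spliced together under a depth bound, so the generated set is
produced by a recursive, decidable procedure even though no single closed-form
grammaticality formula is ever written down.

```python
# Toy construction templates; keys in CAPS are nonterminals. Purely
# illustrative -- a minimal stand-in for 'cutting and pasting' constructions.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # recursive relative clause
    "VP": [["V", "NP"], ["V"]],
    "N":  [["linguist"], ["grammar"]],
    "V":  [["studies"], ["sleeps"]],
}

def expand(symbols, depth):
    """Yield all word lists derivable within a depth bound (hence decidable)."""
    if not symbols:
        yield []
        return
    head, rest = symbols[0], symbols[1:]
    if head not in GRAMMAR:                     # terminal word: keep it
        for tail in expand(rest, depth):
            yield [head] + tail
    elif depth > 0:                             # nonterminal: splice in a rule
        for rule in GRAMMAR[head]:
            for tail in expand(rule + rest, depth - 1):
                yield tail

sentences = {" ".join(words) for words in expand(["S"], depth=6)}
print(len(sentences))
```

Raising the depth bound adds ever more deeply nested relative clauses, which
is the sense in which the process is recursive but still decidable.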

 - As an observation, many languages can be used to directly formulate
additive number words, akin to Roman numerals with a highest unit (e.g. '50s'
or '100s'). Positional multiplicative value was a big discovery, not obvious,
and perhaps requiring inscription rather than speech alone. So that's more
evidence that at least the 'natural' level of natural language complexity is
roughly that of additive arithmetic, and therefore falls short of general
computation, even of a single unbounded multiplication operation. 
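
A small sketch of the additive-versus-positional contrast (my own
illustration; the Roman-style units and the highest unit of 100 are chosen
arbitrarily): an additive system just concatenates copies of fixed units up to
a highest one, so the written form grows linearly with the number, whereas
positional notation names arbitrarily large numbers with a fixed digit
inventory and only logarithmic growth.

```python
# Hypothetical additive units with a highest unit of 100, Roman-style.
# Purely additive: no subtractive forms like 'IX', no positional value.
UNITS = [(100, "C"), (50, "L"), (10, "X"), (5, "V"), (1, "I")]

def additive(n):
    """Write n by greedily concatenating units -- length grows linearly in n."""
    out = []
    for value, sign in UNITS:
        count, n = divmod(n, value)
        out.append(sign * count)
    return "".join(out)

def positional(n, base=10):
    """Positional notation -- length grows only logarithmically in n."""
    digits = []
    while True:
        n, d = divmod(n, base)
        digits.append(str(d))
        if n == 0:
            return "".join(reversed(digits))

print(additive(387))    # CCCLXXXVII
print(positional(387))  # 387
```

Past the highest unit, the additive form is just repetition of 'C', which is
the bounded, pre-positional level of arithmetic the paragraph above gestures
at.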

 - The linguist Dan Everett argued a few years back that the Pirahã (Amazon)
language may not even be recursive. This discussion received a lot of
attention, and there is literature on both sides of the debate.

John Kadvany 

-----Original Message-----
From: fom-bounces at cs.nyu.edu [mailto:fom-bounces at cs.nyu.edu] On Behalf Of
americanmcgeesfr at gmx.net
Sent: Wednesday, November 30, 2011 11:37 AM
To: fom at cs.nyu.edu
Subject: [FOM] expressive power of natural languages

Hello FOMers,

I was wondering if there is any (at least semi-)conclusive view about the
expressive power of a natural language like English, resulting in a statement
like "whatever it is, it is a language of at least 2nd order". 
Of course, I know of Tarski's comment suspecting natural languages to be
somehow (semantically) universal. But what I'm interested in is a hint
pointing me in a direction what to look for, i.e. is the fact that one
quantifies over classes in a natural language enough to label it higher order?
Can there be anything wrong with taking it to be at least a many-sorted
first-order language?

Alex Nowak