REPOSTED
Formal grammar and information theory: together again?
By Fernando Pereira
In the last forty years, research on models of spoken and written language has been split between two seemingly irreconcilable traditions: formal linguistics in the Chomsky tradition, and information theory in the Shannon tradition. Zellig Harris had advocated a close alliance between grammatical and information-theoretic principles in the analysis of natural language, and early formal-language theory provided another strong link between information theory and linguistics. Nevertheless, in most research on language and computation, grammatical and information-theoretic approaches have moved far apart.
Today, after many years on the defensive, the information-theoretic approach has gained new strength and achieved practical successes in speech recognition, information retrieval, and, increasingly, in language analysis and machine translation. The exponential increase in the speed and storage capacity of computers is the proximate cause of these engineering successes, allowing the automatic estimation of the parameters of probabilistic models of language by counting occurrences of linguistic events in very large bodies of text and speech. However, I will also argue that information-theoretic and computational ideas are playing an increasing role in the scientific understanding of language, and will help bring together formal-linguistic and information-theoretic perspectives.
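The "counting occurrences of linguistic events" that the abstract refers to can be illustrated with a minimal sketch (not taken from the paper itself): a bigram language model whose parameters are maximum-likelihood estimates obtained purely by counting. The toy corpus and function name below are my own illustration, standing in for the very large text collections the abstract has in mind.

```python
from collections import Counter

# Toy corpus standing in for the "very large bodies of text" mentioned above.
corpus = "the dog barks . the cat sleeps . the dog sleeps .".split()

# Count unigram and bigram occurrences of linguistic events.
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def bigram_prob(prev, word):
    """Maximum-likelihood estimate P(word | prev) = count(prev, word) / count(prev)."""
    return bigrams[(prev, word)] / unigrams[prev]

print(bigram_prob("the", "dog"))  # "the dog" occurs 2 times, "the" occurs 3 times -> 2/3
```

Real systems of the kind the abstract discusses refine this idea with smoothing and far richer event classes, but the core mechanism is the same: probabilities estimated from raw counts.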
Keywords: Formal linguistics; information theory; machine learning
See also the Baldwin Effect.
THE EVOLUTION OF LANGUAGE FACULTY: CLARIFICATIONS AND IMPLICATIONS
The Nature of the Language Faculty and its Implications for Evolution of Language
THREE FACTORS IN LANGUAGE DESIGN
The Fodor-Pinker Debate
Posted by Tony Marmo at 00:01 BST
Updated: Sunday, 4 September 2005 01:47 BST