LINGUISTIX&LOGIK, Tony Marmo's blog
Monday, 17 October 2005

Topic: Interconnections

On Sense and Direct Reference


By Ben Caplan

Sense Millianism and Object Fregeanism both appeal to modes of presentation to solve one group of problems about one group of cases (namely, those that concern intuitions about the cognitive value of simple sentences, about the truth-value of some propositional-attitude ascriptions, or about sentences that contain empty names); and both appeal to objects or singular propositions to solve another group of problems about another group of cases (namely, those that concern intuitions about the truth-value of simple sentences, about the modal and epistemic profile of simple sentences, or about the truth-value of other propositional-attitude ascriptions). One further problem for both views is to explain, in a principled way, why modes of presentation matter in the first group of cases but not in the second; and, conversely, why objects or singular propositions matter in the second group of cases but not in the first. This further problem is, it seems, pressing and difficult for both views.

Source:
Online Papers in Philosophy

Posted by Tony Marmo at 07:58 BST
Updated: Monday, 17 October 2005 08:09 BST
Saturday, 8 October 2005

Topic: Interconnections

Semantic Conceptions of Information



A new article in the Stanford Encyclopaedia of Philosophy has just been published:
Semantic Conceptions of Information
By Luciano Floridi

Posted by Tony Marmo at 19:34 BST
Updated: Saturday, 8 October 2005 19:37 BST
Sunday, 18 September 2005

Topic: Interconnections

Semantically Relatable Sets: Building Blocks for Representing Semantics


By Rajat Kumar Mohanty, Anupama Dutta and Pushpak Bhattacharyya

Motivated by the fact that, ultimately, automatic language analysis is constituent detection and attachment resolution, we present our work on the problem of generating and linking semantically relatable sets (SRS) as a via media to automatic sentence analysis leading to semantics extraction. These sets are of the form <entity1, entity2> or <entity1 function-word entity2> or <function-word entity>, where the entities can be single words or more complex sentence parts (such as an embedded clause). The challenge lies in finding the components of these sets, which involves solving prepositional phrase (PP) and clause attachment problems, and empty pronominal (PRO) determination. Use is made of
(i) the parse tree of the sentence,
(ii) the subcategorization frames of lexical items,
(iii) the lexical properties of the words and
(iv) lexical resources such as WordNet and the Oxford Advanced Learner’s Dictionary (OALD).

The components within the sets and the sets themselves are linked using the semantic relations of an interlingua for machine translation called the Universal Networking Language (UNL). The work forms part of a UNL-based MT system, where the source language is analysed into semantic graphs and the target language is generated from these graphs. The system has been tested on the Penn Treebank, and the results indicate the effectiveness of our approach.
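
As a purely illustrative sketch (the class name and the toy sentence are mine, not the authors'), the three SRS shapes described in the abstract might be represented in Python as follows:

from dataclasses import dataclass
from typing import Optional

@dataclass
class SemRelSet:
    """One semantically relatable set: up to two entities, optionally
    linked by a function word (preposition, determiner, etc.)."""
    entity1: str                      # a word or a larger constituent
    entity2: Optional[str] = None
    function_word: Optional[str] = None

# The three shapes from the abstract, on the toy sentence "the boy slept on the bench":
examples = [
    SemRelSet(entity1="boy", entity2="slept"),                        # <entity1, entity2>
    SemRelSet(entity1="slept", function_word="on", entity2="bench"),  # <entity1 function-word entity2>
    SemRelSet(entity1="boy", function_word="the"),                    # <function-word entity>
]
for s in examples:
    print(s)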


Keywords: Semantically Relatable Sets, Syntactic and Semantic Constituents, Interlingua Based MT, Parse Trees, Lexical Properties, Argument Structure, Penn Treebank.

Source: Semantics Archive

Posted by Tony Marmo at 00:27 BST
Updated: Sunday, 18 September 2005 08:49 BST
Sunday, 4 September 2005

Now Playing: REPOSTED
Topic: Interconnections

Formal grammar and information theory: together again?


By Fernando Pereira

In the last forty years, research on models of spoken and written language has been split between two seemingly irreconcilable traditions: formal linguistics in the Chomsky tradition, and information theory in the Shannon tradition. Zellig Harris had advocated a close alliance between grammatical and information-theoretic principles in the analysis of natural language, and early formal-language theory provided another strong link between information theory and linguistics. Nevertheless, in most research on language and computation, grammatical and information-theoretic approaches had moved far apart.
Today, after many years on the defensive, the information-theoretic approach has gained new strength and achieved practical successes in speech recognition, information retrieval, and, increasingly, in language analysis and machine translation. The exponential increase in the speed and storage capacity of computers is the proximate cause of these engineering successes, allowing the automatic estimation of the parameters of probabilistic models of language by counting occurrences of linguistic events in very large bodies of text and speech. However, I will also argue that information-theoretic and computational ideas are playing an increasing role in the scientific understanding of language, and will help bring together formal-linguistic and information-theoretic perspectives.
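
As a minimal sketch of what "estimating the parameters of probabilistic models of language by counting occurrences of linguistic events" looks like in practice (illustrative code, not from Pereira's paper), here is a relative-frequency bigram estimator:

from collections import Counter

def bigram_model(corpus_tokens):
    """Estimate P(next_word | word) by relative-frequency counting."""
    unigrams = Counter(corpus_tokens)
    bigrams = Counter(zip(corpus_tokens, corpus_tokens[1:]))
    def prob(w, v):
        return bigrams[(w, v)] / unigrams[w] if unigrams[w] else 0.0
    return prob

tokens = "the cat sat on the mat the cat slept".split()
p = bigram_model(tokens)
print(p("the", "cat"))   # 2/3: "the" is followed by "cat" in two of its three occurrences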


Keywords: Formal linguistics; information theory; machine learning

See also the Baldwin Effect.
THE EVOLUTION OF LANGUAGE FACULTY: CLARIFICATIONS AND IMPLICATIONS
The Nature of the Language Faculty and its Implications for Evolution of Language
THREE FACTORS IN LANGUAGE DESIGN
The Fodor-Pinker Debate
Non-genomic nativism

Posted by Tony Marmo at 00:01 BST
Updated: Sunday, 4 September 2005 01:47 BST
Wednesday, 31 August 2005

Topic: Interconnections

THE POSITION OF SEMANTICS WITHIN CONTEMPORARY COGNITIVE SCIENCE


By Mihailo Antović

This paper provides an analysis of the importance of some present-day semantic theories for contemporary cognitive science. The question of the scope of cognitive science(s) is discussed, followed by a short overview of the study of linguistics in this multidisciplinary enterprise. Finally, three modern approaches to semantics within this framework are discussed (cognitive, truth-conditional and conceptual) and their advantages and disadvantages are briefly summarized. Conceptual semantics is singled out as a rather plausible approach to the study of meaning, even though it is often deemed of lesser importance by authoritative scholars. Some speculations as to the further development of semantics are offered.

Key words: cognition, cognitive science, cognitive semantics, truth-conditional semantics, conceptual semantics

Posted by Tony Marmo at 00:01 BST
Updated: Wednesday, 31 August 2005 06:20 BST
Sunday, 21 August 2005

Topic: Interconnections

Post-Davidsonianism


By Gillian Ramchand

The pioneering work of Davidson (1967) gave rise to a productive and exciting tradition within formal semantics and especially at the interface between syntax and semantics, whereby event variables were exploited as elements of the referential ontology in the expression of the semantics of natural language. The existence of such logical elements (events, or eventuality variables) cannot now seriously be doubted, in my opinion, although many aspects of the formal theory of syntax and semantics have changed in the nearly forty years since Davidson’s seminal article. The time has definitely come for a more critical and nuanced understanding of the use of eventuality variables, in the light of recent research in the field. Maienborn (this volume) is an important example of this kind of work. She takes a new look at the idea of eventuality variables and argues that the case has been overstated, that there are both empirical and conceptual reasons for denying the existence of events in the Davidsonian sense for a certain class of statives and copular predications. I wish to show in this article that Maienborn both goes too far and not far enough in deconstructing the traditional Davidsonian assumptions. Instead, I propose a Davidson-inspired method of representation which fits better with current syntactic understanding, but which is liberated from some of the assumptions and methodologies of earlier work – I call this ‘Post-Davidsonianism’. I will argue that once one makes the adjustments in the Davidsonian tradition to make the idea coherent, Maienborn’s arguments for introducing a new ontological type (‘Kimian states’) into the system disappear.
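
For readers unfamiliar with eventuality variables, a textbook illustration of Davidson's (1967) proposal (the example is mine, not Ramchand's) treats adverbial modifiers as predicates of an existentially bound event variable:

    "Brutus stabbed Caesar violently with a knife"
    ∃e [stab(e, Brutus, Caesar) ∧ violent(e) ∧ with(e, the knife)]

Dropping conjuncts then yields the intuitively valid entailments (e.g. to "Brutus stabbed Caesar"), which is the classic motivation for positing the event variable in the first place.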

Source: LingBuzz

Appeared in Theoretical Linguistics 31 (2005), 359–373

Posted by Tony Marmo at 01:35 BST
Updated: Sunday, 21 August 2005 01:57 BST
Monday, 15 August 2005

Topic: Interconnections

Typical Ambiguity: Trying to Have Your Cake and Eat it too.


By Solomon Feferman

Ambiguity is a property of syntactic expressions which is ubiquitous in all informal languages–natural, scientific and mathematical; the efficient use of language depends to an exceptional extent on this feature. Disambiguation is the process of separating out the possible meanings of ambiguous expressions. Ambiguity is typical if the process of disambiguation can be carried out in some systematic way. Russell made use of typical ambiguity in the theory of types in order to combine the assurance of its (apparent) consistency (“having the cake”) with the freedom of the informal untyped theory of classes and relations (“eating it too”). The paper begins with a brief tour of Russell’s uses of typical ambiguity, including his treatment of the statement Cls ∈ Cls. This is generalized to a treatment in simple type theory of statements of the form A ∈ B where A and B are class expressions for which A is prima facie of the same or higher type than B. In order to treat mathematically more interesting statements of self-membership we then formulate a version of typical ambiguity for such statements in an extension of Zermelo-Fraenkel set theory. Specific attention is given to how the “naive” theory of categories can thereby be accounted for.
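
As a rough illustration of the idea (the notation is mine, not Feferman's): under typical ambiguity the type indices in a statement such as Cls ∈ Cls are suppressed, and the statement is understood as holding under any assignment of indices that makes it well-typed, for instance

    Cls_n ∈ Cls_{n+1}

where Cls_n, the class of all classes of type n, is itself an object of type n+1.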

Appeared in the book One Hundred Years of Russell's Paradox (G. Link, ed.), Walter de Gruyter, Berlin (2004) 135-151

Posted by Tony Marmo at 07:01 BST
Updated: Monday, 15 August 2005 07:04 BST
Monday, 18 July 2005

Topic: Interconnections

The Gamut of Dynamic Logics


By Jan van Eijck & Martin Stokhof

Dynamic logic, broadly conceived, is the logic that analyses change by decomposing actions into their basic building blocks and by describing the results of performing actions in given states of the world. The actions studied by dynamic logic can be of various kinds: actions on the memory state of a computer, actions of a moving robot in a closed world, interactions between cognitive agents performing given communication protocols, actions that change the common ground between speaker and hearer in a conversation, actions that change the contextually available referents in a conversation, and so on.
In each of these application areas, dynamic logics can be used to model the states involved and the transitions that occur between them. Dynamic logic is a tool for both state description and action description. Formulae describe states, while actions or programs express state change. The levels of state descriptions and transition characterisations are connected by suitable operations that allow reasoning about pre- and post-conditions of particular changes.
From a computer science perspective, dynamic logic is a formal tool for reasoning about programs. Dynamic logic provides the means for formalising correctness specifications, for proving that these specifications are met by a program under consideration, and for reasoning about the equivalence of programs. From the perspective of the present paper, this is but one of many application areas. We will also look at dynamic logics for cognitive processing, for communication and information updating, and for various aspects of natural language understanding.
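
As a minimal, hypothetical sketch (not from the paper) of the state/action picture just described, the following Python fragment evaluates the two basic dynamic-logic modalities over a toy labelled transition system: [a]p ("p holds after every way of doing a") and <a>p ("p holds after some way of doing a").

# States are labelled with atomic facts, actions are labelled transitions.
states = {"s0", "s1", "s2"}
facts = {"s0": {"ready"}, "s1": {"done"}, "s2": {"done"}}
trans = {("s0", "a"): {"s1", "s2"}, ("s1", "b"): {"s0"}}   # action relations

def holds(p, s):                      # atomic formula p at state s
    return p in facts[s]

def box(action, p, s):                # [action]p : p holds after every execution of action
    return all(holds(p, t) for t in trans.get((s, action), set()))

def diamond(action, p, s):            # <action>p : some execution of action reaches p
    return any(holds(p, t) for t in trans.get((s, action), set()))

print(box("a", "done", "s0"))         # True: every a-step from s0 ends in a "done" state
print(diamond("b", "ready", "s1"))    # True: some b-step from s1 reaches a "ready" state

A correctness specification of the form p → [a]q can then be checked pointwise: wherever the pre-condition p holds, [a]q must hold as well.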


Source: Online Papers in Philosophy

Posted by Tony Marmo at 23:51 BST
Friday, 8 July 2005

Topic: Interconnections

Definability and Invariance


By Newton C. A. da Costa & Alexandre Augusto Martins Rodrigues

In his thesis Para uma Teoria Geral dos Homomorfismos (1944), the Portuguese mathematician José Sebastião e Silva constructed an abstract or generalized Galois theory that is intimately linked to F. Klein's Erlangen Program and that foreshadows some notions and results of today's model theory; an analogous theory was independently worked out by M. Krasner in 1938. But Silva's work on the subject is neither wholly clear nor sufficiently rigorous. In this paper we present a rigorous version of the theory, correcting the shortcomings of Silva's exposition and extending some of its main results.

Source: CLE
Of related interest: Remarks on Abstract Galois Theory by Newton C.A. da Costa.

Posted by Tony Marmo at 02:41 BST
Updated: Friday, 8 July 2005 02:48 BST
Monday, 4 July 2005

Topic: Interconnections

The Elimination of Self-Reference
(Generalized Yablo-Series and the Theory of Truth)


By Philippe Schlenker

Although it was traditionally thought that self-reference is a crucial ingredient of semantic paradoxes, Yablo showed that this is not so by displaying an infinite series of non-referential sentences which, taken together, are paradoxical (e.g. Yablo 2004). We generalize Yablo's result along two dimensions.
1. First, we investigate the behavior of Yablo-style series of the form {<s(i), [Qk: k > i] f[(s(k))_{k ≥ i}]> : i ≥ 0}, where for each i, s(i) is a term that denotes the sentence [Qk: k > i] f[(s(k))_{k ≥ i}] (for some generalized quantifier Q and for some (fixed) truth function f). We show that for any n-valued compositional semantics and for any quantifier Q that satisfies certain natural properties, all the sentences in the series must have the same value. We derive a characterization of those values of Q for which the series is paradoxical in a natural trivalent logic.
2. Second, we show that in the Strong Kleene trivalent logic, Yablo's results are a special case of a much more general phenomenon: given certain assumptions, any semantic phenomenon that involves self-reference can be reproduced without self-reference (Cook 2004 proves a special case of this result, which only applies to logical paradoxes).

Specifically, we can associate to each pair <s, F> of a formula F named by a term s in a language L' a series of translations {<s(i), [Qk: k > i] [F]_k> : i ≥ 0} (where [F]_k is a certain modification of F) in a quantificational language L* in such a way that
(i) none of the translations are self-referential,
(ii) in any fixed point I* of L*, all the translations of a given formula of L have the same value according to I*, and
(iii) there is a correspondence between the fixed points of L' and the fixed points of L* which ensures that the translations really do have the same semantic behavior as the sentences they translate.

We give a characterization of those generalized quantifiers Q which can be used in the translation.
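
For orientation, the original Yablo series that these results generalize (the special case in which Q is the universal quantifier and f is negation of the truth predicate) consists of the sentences

    S(i): for every k > i, S(k) is not true        (i ≥ 0)

No S(i) refers to itself, yet no consistent assignment of classical truth values to the whole series is possible.
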
Source: Online Papers in Philosophy

Posted by Tony Marmo at 07:15 BST
Updated: Monday, 4 July 2005 07:24 BST
Thursday, 2 June 2005

Topic: Interconnections

Many-valued logic vs. many-valued semantics


By Jaroslav Peregrin

Hence the task of the logician, viewed from this perspective, is the delimitation of the range of acceptable truth-valuations of the sentences of the given language – taking note of all the "lawful" features of the separation of true sentences from false ones. Let us call this the separation problem.
Consider the language of classical propositional calculus (and consequently the part of natural language which it purports to regiment). Here the ensuing "laws of truth" are quite transparent:
(i) ¬A is true iff A is not true
(ii) A∧B is true iff A is true and B is true
(iii) A∨B is true iff A is true or B is true
(iv) A→B is true iff A is not true or B is true

Every truth-valuation which fulfills these constraints is acceptable and every acceptable truth-valuation does fulfill them. But the situation is, as is well known, not so simple once we abandon the calm waters of the classical propositional calculus.
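
As a small illustrative sketch (not from Peregrin's paper), the separation problem for this language can be phrased as checking an arbitrary truth-valuation, i.e. an assignment of truth values to whole formulae rather than just to atoms, against constraints (i)-(iv):

# Formulae are nested tuples: ("not", A), ("and", A, B), ("or", A, B), ("if", A, B).
# A valuation must assign a value to every formula and to each of its subformulae.

def acceptable(valuation):
    """Return True iff the valuation (dict: formula -> bool) fulfils (i)-(iv)."""
    for f, v in valuation.items():
        if isinstance(f, tuple):
            op, *args = f
            sub = [valuation[a] for a in args]
            expected = {"not": lambda a: not a,
                        "and": lambda a, b: a and b,
                        "or":  lambda a, b: a or b,
                        "if":  lambda a, b: (not a) or b}[op](*sub)
            if v != expected:
                return False
    return True

v = {"p": True, "q": False, ("not", "q"): True, ("and", "p", ("not", "q")): True}
print(acceptable(v))   # True: this valuation obeys (i)-(iv)

For classical propositional logic, the valuations that pass this check are exactly those generated in the usual way from an assignment to the atoms; as the excerpt notes, matters become less simple beyond the classical calculus.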

Posted by Tony Marmo at 00:01 BST
Saturday, 14 May 2005

Topic: Interconnections

Counterfactuals and historical possibility


by Tomasz Placek and Thomas Müller

We show that truth conditions for counterfactuals need not always be given in terms of a vague notion of similarity. To this end, we single out the important class of historical counterfactuals and give formally rigorous truth conditions for these counterfactuals, employing a partial ordering relation called “comparative closeness” that is defined in the framework of branching space-times. Among other applications, we provide a detailed analysis of counterfactuals uttered in the context of lost bets. In an appendix we compare our theory with the branching space-times based reading of counterfactuals recently proposed by Belnap [1992].

Keywords: branching space-times, historical counterfactuals, comparative closeness, betting


Source: PhilSci Archive, Online Papers in Philosophy

Posted by Tony Marmo at 00:01 BST
Updated: Sunday, 15 May 2005 08:14 BST
Wednesday, 20 April 2005

Topic: Interconnections

The Ken Hale Prize



Throughout his life Ken Hale set an example of what competence means in academic circles: a person of great skill who used it to help others build positive things. He combined ethics and expertise in his daily work.
Before his passing, he asked that any donations in his memory be made to the Navajo Language Academy, an organisation of Navajo scholars whom he had trained.

I think that a Ken Hale Prize should be created for socially conscious linguists and scholars, and for those who have helped to promote and foster Linguistics by encouraging new linguists and spreading the work of linguists.

Posted by Tony Marmo at 15:40 BST
Monday, 18 April 2005

Topic: Interconnections

WHY ARE HUMANS UNLIKE OTHER ANIMALS?



What are we supposed to say when the answer to a question is the question itself? This seems to have been the central puzzle of studies on human behaviour and nature.

Posted by Tony Marmo at 02:49 BST
Updated: Monday, 18 April 2005 13:21 BST
Tuesday, 29 March 2005

Now Playing: REPOSTED
Topic: Interconnections

How Does the Mind Work?
The Fodor-Pinker Debate



The current issue of Mind and Language contains an interesting open peer review cluster consisting of three articles:

1. So How Does the Mind Work? By Steven Pinker (Online publication date: 3-Feb-2005)
2. Reply to Steven Pinker 'So How Does The Mind Work?' By Jerry Fodor (Online publication date: 3-Feb-2005)
3. A Reply to Jerry Fodor on How the Mind Works By Steven Pinker (Online publication date: 3-Feb-2005)

Posted by Tony Marmo at 00:01 GMT
Updated: Tuesday, 29 March 2005 15:45 GMT
