LINGUISTIX&LOGIK, Tony Marmo's blog
Saturday, 14 August 2004


What does Paraconsistency do?

The case of belief revision

by Koji Tanaka

In this talk, I apply a paraconsistent logic to Grove's sphere semantics, which is a model for the AGM theory of belief revision. First, I examine the soundness of the paraconsistent sphere semantics with respect to the AGM postulates. Second, I discuss some differences between the classical (AGM) approach and a paraconsistent approach. I then argue that a theory of belief revision based on paraconsistent logic is simple, elegant, and of universal use.
Download link
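Grove's construction can be sketched concretely. In this illustration (mine, not the paper's, and ignoring the paraconsistent refinements Tanaka discusses), a belief state is a system of nested spheres of possible worlds, and revising by new information P selects the P-worlds in the smallest sphere that P intersects:

```python
# Minimal sketch of Grove's sphere semantics for AGM revision.
# Worlds are plain labels; spheres are nested sets of worlds, innermost first.
# The innermost sphere holds the worlds compatible with current beliefs.

def revise(spheres, p):
    """Revise by proposition p (a set of worlds where the new info holds):
    return the p-worlds in the smallest sphere that intersects p."""
    for sphere in spheres:          # spheres are ordered innermost to outermost
        hit = sphere & p
        if hit:
            return set(hit)
    return set(p)                   # p lies outside every sphere: adopt p itself

# Innermost sphere {1}: we currently believe we are in world 1.
spheres = [frozenset({1}), frozenset({1, 2}), frozenset({1, 2, 3, 4})]

# New information rules out world 1; the smallest sphere meeting {2, 3} is {1, 2}.
print(revise(spheres, {2, 3}))      # {2}
```

The nesting encodes comparative plausibility, which is what lets revision satisfy the AGM postulates; the paper's question is what survives of this picture when the background logic is paraconsistent.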

Posted by Tony Marmo at 06:44 BST
Updated: Saturday, 14 August 2004 06:50 BST

Descriptions and Beyond

Kent Bach, Kai von Fintel, François Recanati and others have published a new book about Descriptions.

Of related interest:

A Corpus-Based Investigation of Definite Description Use

by M. Poesio and R. Vieira

This paper presents the results of a study of definite description use in written texts, aimed at assessing the feasibility of annotating corpora with information about definite description interpretation.
Download link

Posted by Tony Marmo at 01:01 BST
Updated: Saturday, 14 August 2004 07:10 BST


Assertion and Denial, Commitment and Entitlement, and Incompatibility

by Greg Restall

In this short paper, I compare and contrast the kind of symmetricalist treatment of negation favoured in different ways by Huw Price (in "Why `Not'?") and by me (in "Multiple Conclusions") with Robert Brandom's analysis of scorekeeping in terms of commitment, entitlement and incompatibility.

Both kinds of account provide a way to distinguish the inferential significance of " A" and "A is warranted" in terms of a subtler analysis of our practices: on the one hand, we assert as well as deny; on the other, we distinguish downstream commitments from upstream entitlements and the incompatibility definable in terms of these. In this note I will examine the connections between these different approaches.


Posted by Tony Marmo at 01:01 BST
Thursday, 12 August 2004
Giannakidou on A puzzle about the Present Perfect

Until and the Present Perfect

Anastasia Giannakidou wrote a paper about the (im-)possibility of sentences combining a present perfect and an UNTIL connective, like the ones below from Greek and English. In her words, Until and its Greek counterpart mexri produce odd results when they modify an eventuality in the present perfect:
(1) *I Ariadne exi zisi sto Parisi mexri tora.
Ariadne has lived in Paris until now.

(2) *I Ariadne exi xasi ta klidia tis mexri tora.
Ariadne has lost her keys until now.

(3) *Ariadne has lived in Paris until 1998.

This is a puzzle in the light of two common assumptions that predict no incompatibility between the use of an UNTIL term and a form of Present Perfect:
(i) perfect eventualities denote result states (McCoard 1978, Dowty 1979, Vlach 1983, Kamp and Reyle 1993),
(ii) UNTIL is a stative modifier.

I sense no such incompatibilities when using any equivalent Portuguese tenses in the first case:

(4) Ariadne viveu / tem vivido / vem vivendo em Paris até agora.

But in the second case there is a distinction between using the real participle and an adjective form:

(5) *Ariadne tem perdido (participle) as chaves até agora.
(6) Ariadne tem perdidas (adjective) as chaves até agora.

Perhaps this is further evidence for the known fact that not all languages have Perfect Tenses like English does.

See also this other post about Roumyana Pancheva's paper on the present perfect tense puzzle.

Posted by Tony Marmo at 06:43 BST
Updated: Thursday, 12 August 2004 06:48 BST
Wednesday, 11 August 2004
On the (non-)surprising paradoxes
Here are two related posts:

Paradox vs. Surprise

By Jon Kvanvig
Source: Certain Doubts

A paradox is different from a result that is merely surprising, but what is the difference? This question touches on matters beyond epistemology, but it is applicable to the major epistemic paradoxes, including preface, lottery, surprise quiz, and knowability. It is the latter that prompts my question.

In the knowability paradox, we purportedly demonstrate that if all truths are knowable, then all truths are known. There is no question that the result is surprising, but what makes it paradoxical? Compare it with Gödel's incompleteness theorems, for example, which are also quite surprising, but not paradoxical. Or compare it to the ontological argument, where it is purportedly shown that if a certain description is possibly exemplified, then it is necessarily exemplified. This, too, is quite surprising, but I doubt it is paradoxical. So, what is the difference?

Perhaps the difference is psychological. Logical results are surprising when they go beyond what we presently believe to be true, and when they concern issues of significance to us. We notice the result which, prior to the proof, we doubted; after seeing the proof, we are convinced and thereby surprised. When the result is paradoxical, however, something additional happens. The proof threatens our intellectual commitments in some way: it threatens our firmly held opinions on matters that are significant to us. Admitting the soundness of a proof to the contrary thus engenders a bit of mental apoplexy: we know something has to give, but it's hard to see what.

Is there a different account of the distinction? I'm not sure; if you have a different account, please share it. But if we suppose that this account is on track, one has to dig a bit to find a paradox in the knowability result. The result is a conditional: if every truth is knowable, then every truth is known. That's not a denial of any deeply entrenched viewpoint I hold on issues that are significant to me. So why the fuss? I think there is something paradoxical in the neighborhood here, and I think it has important lessons. But since it depends on the psychological account of the difference between surprising and paradoxical derivations, I'll hold off to see if there might be a better account of the difference.

The Swedish Drill

By: Leon Felkins

Swedish civil defense authorities announced that a civil defense drill would be held one day the following week, but the actual day would be a surprise.
However, we can prove by induction that the drill cannot be held. Clearly, they cannot wait until Friday, since everyone will know it will be held that day. But if it cannot be held on Friday, then by induction it cannot be held on Thursday, Wednesday, or indeed on any day.

What is wrong with this proof?
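The inductive step can be made mechanical. Here is a minimal sketch (my illustration; it encodes the backward-induction argument itself, not a verdict on what is wrong with it):

```python
# Backward induction on the surprise-drill announcement: a day is eliminated
# once every later day has been eliminated, because the drill would then be
# expected on that day and so could not be a surprise.

def eliminated_days(days):
    """Return the days ruled out by the backward-induction argument."""
    eliminated = set()
    for i in range(len(days) - 1, -1, -1):      # work from Friday backwards
        if all(d in eliminated for d in days[i + 1:]):
            eliminated.add(days[i])             # drill here would be no surprise
    return [d for d in days if d in eliminated]

week = ["Mon", "Tue", "Wed", "Thu", "Fri"]
print(eliminated_days(week))                    # the induction rules out every day
```

That every day gets eliminated is exactly the paradox: the formalised argument leaves no day on which the drill can be held, yet a drill on Wednesday would plainly surprise everyone.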

This problem has generated a vast literature (see here). Several solutions of the paradox have been proposed, but as with most paradoxes there is no consensus on which solution is the "right" one.

The earliest writers (O'Connor, Cohen, Alexander) see the announcement as simply a statement whose utterance refutes itself. If I tell you that I will have a surprise birthday party for you and then tell you all the details, including the exact time and place, then I destroy the surprise, refuting my statement that the party will be a surprise.

Soon, however, it was noticed that the drill could occur (say on Wednesday), and still be a surprise. Thus the announcement is vindicated instead of being refuted. So a puzzle remains.

One school of thought (Scriven, Shaw, Medlin, Fitch, Windt) interprets the announcement that the drill is unexpected as saying that the date of the drill cannot be deduced in advance. This begs the question: deduced from which premises? Examination of the inductive argument shows that one of the premises used is the announcement itself, and in particular the fact that the drill is unexpected. Thus the word "unexpected" is defined circularly. Shaw and Medlin claim that this circularity is illegitimate and is the source of the paradox. Fitch uses Gödelian techniques to produce a fully rigorous self-referential announcement, and shows that the resulting proposition is self-contradictory. However, none of these authors explains how it can be that this illegitimate or self-contradictory announcement nevertheless appears to be vindicated when the drill occurs. In other words, what they have shown is that under one interpretation of "surprise" the announcement is faulty, but their interpretation does not capture the intuition that the drill really is a surprise when it occurs, and thus they are open to the charge that they have not captured the essence of the paradox.

Another school of thought (Quine, Kaplan and Montague, Binkley, Harrison, Wright and Sudbury, McClelland, Chihara, Sorenson) interprets surprise in terms of knowing instead of deducing. Quine claims that the victims of the drill cannot assert that on the eve of the last day they will know that the drill will occur on the next day. This blocks the inductive argument from the start, but Quine is not very explicit in showing what exactly is wrong with our strong intuition that everybody will "know" on the eve of the last day that the drill will occur on the following day. Later writers formalize the paradox using modal logic (a logic that attempts to represent propositions about knowing and believing) and suggest that various axioms about knowing are at fault, e.g., the axiom that if one knows something, then one knows that one knows it (the KK axiom). Sorenson, however, formulates three ingenious variations of the paradox that are independent of these doubtful axioms, and suggests instead that the problem is that the announcement involves a blindspot:

a statement that is true but which cannot be known by certain individuals even if they are presented with the statement.

This idea was foreshadowed by O'Beirne and Binkley. Unfortunately, a full discussion of how this blocks the paradox is beyond the scope of this summary.

Finally, there are two other approaches that deserve mention. Cargile interprets the paradox as a game between ideally rational agents and finds fault with the notion that ideally rational agents will arrive at the same conclusion independently of the situation they find themselves in. Olin interprets the paradox as an issue about justified belief: on the eve of the last day one cannot be justified in believing BOTH that the drill will occur on the next day AND that the drill will be a surprise even if both statements turn out to be true; hence the argument cannot proceed and the drill can be a surprise even on the last day.

For those who wish to read some of the literature, good papers to start with are Bennett-Cargile and both papers of Sorenson. All of these provide overviews of previous work and point out some errors, and so it's helpful to read them before reading the original papers. For further reading on the "deducibility" side, Shaw, Medlin and Fitch are good representatives. Other papers that are definitely worth reading are Quine, Binkley, and Olin.

Posted by Tony Marmo at 01:01 BST
Updated: Sunday, 15 August 2004 08:40 BST


Paraconsistency Everywhere

By Greg Restall

Paraconsistent logics are, by definition, inconsistency tolerant:
In a paraconsistent logic, inconsistencies need not entail everything. However, there is more than one way a body of information can be inconsistent. In this paper I distinguish contradictions from other inconsistencies, and I show that several different logics are, in an important sense, 'paraconsistent' in virtue of being inconsistency tolerant without thereby being contradiction tolerant. For example, even though no inconsistencies are tolerated by intuitionistic propositional logic, some inconsistencies are tolerated by intuitionistic predicate logic. In this way, intuitionistic predicate logic is, in a mild sense, paraconsistent. So too are orthologic and quantum propositional logic and other formal systems. Given this fact, a widespread view that traditional paraconsistent logics are especially repugnant because they countenance inconsistencies is undercut. Many well-understood nonclassical logics countenance inconsistencies as well.

Download link

Posted by Tony Marmo at 01:01 BST
Monday, 9 August 2004
Topic: Cognition & Epistemology

Isn't Tony multi-present? Why?

Children are interesting from both the epistemological and the logical-philosophical points of view, because they show how the mind works without the large stock of preconceived notions that a human gains with age. For instance, I remember that years ago I liked to apply the following test to children who were beginning to talk:

I asked them to check whether I was elsewhere or not, designating a certain place. They often went to the other place I indicated and called me several times. As they did not get any answers from me there, they came back and told me that I was not there.

This, I suspect, is evidence that children have a naturally and highly logical way of thinking. In this case, as they had no a priori reason to assume that I am not a multi-present being, they would not think that I could not be at two different places at the same time.

Posted by Tony Marmo at 01:01 BST
Updated: Monday, 9 August 2004 08:28 BST


Metaphilosophical Pluralism and Paraconsistency:

From Orientative to Multi-level Pluralism

by Orellana Benado, Andrés Bobenrieth, Carlos Verdugo

In a famous passage, Kant claimed that controversy and the lack of agreement in metaphysics--here understood as philosophy as a whole--were a `scandal.' Attempting to motivate his critique of pure reason, a project aimed both at ending the scandal and at setting philosophy on the `secure path of science,' Kant endorsed the view that for as long as disagreement reigned sovereign in philosophy, there would be little to be learned from it as a science. The success of philosophy begins when controversy ends and culminates when the discipline itself as it has been known disappears. On the other hand, particularly in the second half of the twentieth century, many have despaired of the very possibility of philosophy constituting the search for truth, that is to say, a cognitive human activity, and thus a source of knowledge. This paper seeks to sketch a research program motivated by an intuition that opposes both of these views.


Posted by Tony Marmo at 01:01 BST
Updated: Monday, 9 August 2004 08:22 BST

Context of Thought and Context of Utterance

(A Note on Free Indirect Discourse and the Historical Present)

by Philippe Schlenker


Based on the analysis of narrations in Free Indirect Discourse and the Historical Present, we argue (building in particular on Banfield 1982 and Doron 1991) that the grammatical notion of context of speech should be ramified into a Context of Thought and a Context of Utterance. Tense and person depend on the Context of Utterance, while all other indexicals (including here, now and the demonstratives) are evaluated with respect to the Context of Thought. Free Indirect Discourse and the Historical Present are analyzed as special combinatorial possibilities that arise when the two contexts are distinct, and exactly one of them is presented as identical to the physical point at which the sentence is articulated.


Posted by Tony Marmo at 00:01 BST
Updated: Monday, 9 August 2004 08:06 BST
Sunday, 8 August 2004

Topic: Cognition & Epistemology


I often like to play the following game in telling stories to people (either grown ups or children):

First-- I pick the title of a known tale and start telling another story with the characters of a third source. E.g.:

Puss in Boots

Once upon a time there lived three young sisters: Snow White, Goldilocks and Red Riding Hood. Their father was a very good woodchopper who had married an evil woman. Their stepmother made their lives miserable and forced them to do all the chores, while she kept practising her witchcraft. One day she put a spell on the woodchopper and made him go to the market with his young daughters in order to sell them. 'We shall need the money to buy our victuals,' she said. Then the mesmerised man went to the market with his daughters...

Second-- In the middle of the story, I ask the hearers some unexpected question, like:

Whom will the three girls meet on the road before they get to the market? The Wolf or the Charming Prince?

People often got confused by this kind of game and made all sorts of guesses. It took a long time before they realised that there could be no right answer according to common lore, because the story had been twisted from the beginning.

I have seen this kind of problem in scientific discussions too. People usually try to discuss the implications and the empirical testing methods employed to confirm or discard assumptions that were absurd from the start.

Indeed, people do not like to question premises, but are eager to have heated arguments on the consequences. Why?

Posted by Tony Marmo at 01:01 BST
Updated: Monday, 9 August 2004 07:41 BST

On King's Syntactic Evidence for Semantic Theory

by Brian Weatherson
Source: Thoughts Arguments and Rants 2/14/2003

I finally got around to reading Jeff King's paper on syntactic evidence for semantic theories, and I was struck by one of the examples he uses. At first I thought what he said was obviously wrong, but on reflection I think it might not be so obvious. (Most of the paper seems right to me, at least on a first reading through, but I didn't have anything detailed to say about those parts. Well, except to note that debates in philosophy of language are getting pointed nowadays.)

Anyway, here was the point that I think I disagree with. Jeff wants to argue that syntactic data can sometimes be used to support semantic theories. One example of this (not the only one) involves negative polarity items (NPIs). Examples of NPIs are ever and any, when the latter is meant as an existential quantifier. It seems these words can only appear inside the scope of a negation, or in a context that behaves in some ways as if it were inside the scope of a negation.

Simplifying the argument a little bit, Jeff seems to suggest that the following argument could be used to provide support for its conclusion.

(1) NPIs are licenced in the antecedents of conditionals

(2) NPIs are only licenced in downwards entailing contexts

(3) The antecedent of a conditional is a downwards entailing context

A `downwards entailing context' is (roughly) one where replacing a more general term with a more specific term produces a logically weaker sentence. So while (3a) does not entail (3b), thus showing ordinary contexts are not downwards entailing, (3c) does entail (3d), showing negated contexts are downwards entailing.

(3a) I will be given a birthday cake tomorrow.

(3b) I will be given a poisonous birthday cake tomorrow.

(3c) I will not be given a birthday cake tomorrow.

(3d) I will not be given a poisonous birthday cake tomorrow.

(I assume here that poisonous birthday cakes are still birthday cakes. I do hope that's true, or all my examples here will be no good.)
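Downward entailingness can be checked mechanically on a toy model. In this sketch (my illustration, not from King's or Weatherson's text), propositions are sets of possible worlds, entailment is the subset relation, and a context is a function on propositions:

```python
from itertools import chain, combinations

WORLDS = frozenset(range(6))        # a small toy space of possible worlds

def entails(p, q):
    """p entails q iff every p-world is a q-world."""
    return p <= q

def downward_entailing(context):
    """A context is downwards entailing iff it reverses entailment:
    whenever p entails q, context(q) entails context(p)."""
    props = [frozenset(s) for s in chain.from_iterable(
        combinations(WORLDS, r) for r in range(len(WORLDS) + 1))]
    return all(entails(context(q), context(p))
               for p in props for q in props if entails(p, q))

def identity(p):        # plain (unembedded) assertion
    return p

def negation(p):        # "I will not ..." : complement of the proposition
    return WORLDS - p

print(downward_entailing(identity))   # False: ordinary contexts don't reverse entailment
print(downward_entailing(negation))   # True: negation does, as (3c)/(3d) illustrate
```

On this picture "poisonous birthday cake" denotes a subset of the "birthday cake" worlds, which is exactly why the entailment flips under negation.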

(2) was first proposed (to the best of my knowledge) in William Ladusaw's dissertation in I think 1979, and it has been revised a little since then, but many people I think hold that it is something like the right theory of when NPIs are licenced. But it does have one striking consequence: it implies (3). To give you a sense of how surprising (3) is, note that it implies that (4) entails (5).

(4) If I am given a birthday cake tomorrow, I will be happy.

(5) If I am given a poisonous birthday cake tomorrow, I will be happy.

Now, many people think that (4) could be true while (5) is false. It is certainly the case that there are contexts in which one could say (4) and not say (5). Perhaps the best explanation for that is pragmatic. Those who think that indicative conditionals are either material or strict implications will hold that it is pragmatic. But perhaps it is semantic. Officially, I think it is semantic, though I think the other side of the case has merit.

Here's where I think I disagree with Jeff. Imagine I am undecided about whether (4) really does entail (5). I think that the argument (1), (2) therefore (3) has no force whatsoever towards pushing me to think that it does. Rather, I think that only evidence to do with conditionals can tell in favour of the entailment of (5) by (4), and if that evidence is not sufficient to support the entailment claim, all the worse for premise (2).

At least, that was what I thought at first. On second thoughts, I think maybe I was a little too dogmatic here. On further review, though, I think my dogmatism was in the right direction. To test this, try a little thought experiment.

Imagine you think that all the evidence, except the evidence about conditionals, supports (2), or some slightly tidied up version of it. (This is not too hard to imagine, I think; (2) does remarkably well at capturing most of the data.) And imagine that you think that while there are pragmatic explanations of the apparent counter-examples to "If Fa then p entails If Fa and Ga then p", you think those explanations are fairly weak. (Again, not too hard to imagine.) Does the inductive evidence in favour of (2), which we acknowledge is substantial, and the obvious truth of (1) give you reason to take those pragmatic explanations more seriously, and more generally more reason to believe that "If Fa then p" does entail "If Fa and Ga then p"? I still think no, but I can see why this might look like dogmatism to some.

I sometimes play at being a semanticist, but at heart I'm always a philosopher. And one of the occupational hazards of being a philosopher is that one takes methodological questions much more seriously than perhaps one ought. So at some level I care more about the methodological question raised in the last paragraph than I care about the facts about conditionals and NPIs. At that level, I'm rather grateful to Jeff for raising this question, because it's one of the harder methodological questions I think I've seen for a while.

Posted by Tony Marmo at 00:01 BST
Updated: Monday, 9 August 2004 07:57 BST
Kai von Fintel's reply to Weatherson's comments:

Your thoughts here are quite on target. One can take distributional/syntactic facts (NPI-licensing in conditional antecedents) as an argument for a semantic analysis (monotonic semantics for conditionals with additional epicycles). But one can also take semantic evidence (apparent entailment patterns) as an argument against a particular analysis of the distribution patterns (against the Fauconnier-Ladusaw theory of NPIs for example). So, there is a tension here between syntax and semantics, which is precisely why it is necessary to always do both of them: you can't be a semanticist without knowing a whole lot about syntax, and vice versa. On top of that, it is inevitable that one needs to take pragmatics into account. In the end, this kind of inquiry is part of a complex science and there are a lot of moving parts.

The particular fact of NPI-licensing in conditional antecedents has been a major focus of my own work on conditionals, see my two papers:

Counterfactuals in a Dynamic Context (2001) in Michael Kenstowicz (ed.) Ken Hale: A Life in Language, MIT Press. pp. 123-152.

NPI Licensing, Strawson Entailment, and Context Dependency (1999) Journal of Semantics, 16(2), pp. 97-148.

Source: von Fintel's blog

Posted by Tony Marmo at 00:01 BST
Updated: Monday, 9 August 2004 07:56 BST

Topic: Cognition & Epistemology


Knowledge and Stability

by Joe Shieber
June 08, 2004

Marc Moffett has been considering some interesting questions concerning knowledge and stable belief and justification at Close Range. In response to some probing questions, he submitted a follow-up post, including the following example :

The other day I was going out of town and was supposed to call some friends when I got into the airport. My wife wrote their number down and I glanced over it. As I was leaving, she reminded me to take the number. I said, 'I know it' and proceeded to recite it from memory. Knowing that the number was still fresh in my mind, her response was, 'Do you really know it?'

Marc suggests that the example shows that knowledge sometimes requires not simply reliably-produced true belief (let's grant that the short-term memorial faculty allowing Marc to rattle off the number correctly is reliable), but stable belief, or stably justified belief. Marc claims that we have an intuitive grasp of stability and instability to which he can appeal in making this suggestion. However, and without meaning to be difficult, I still don't know what stability is; nevertheless, let's leave this problem aside.

What I want to do here is suggest an alternate diagnosis for Marc's example.

Knowledge Discourses and Interaction Technology

by Carsten Sørensen & Masao Kakihara

Research within knowledge management tends either to overemphasize or to underestimate the role of Information and Communication Technology (ICT). Furthermore, much of the ICT support debate has been shaped by the data-information-knowledge trichotomy and has been too focused on repository-based approaches.
We wish to engage in a principled debate concerning the character and role of knowledge technologies in contemporary organizational settings. The aim of this paper is to apply four perspectives on the management of knowledge to highlight four perspectives on technological options. The paper presents, based on four knowledge discourses --four interrelated perspectives on the management of knowledge-- four perspectives on ICT support for the management of knowledge each reviewing relevant literature and revealing a facet of how we can conceptualize the role of technology for knowledge management.
The four technology discourses focus on: the production and distribution of information; the interpretation and navigation of information; the codification and embedding of collaboration; and the establishment and maintenance of connections.

The Relationship Between Knowledge and Understanding

by Michelle Jenkins

I've been thinking a lot lately about the relationship between knowledge and understanding. Knowledge and understanding, I think, are quite different sorts of things. My general grasp of the nature of understanding is influenced largely by the Ancients. One understands something if she 1) is able to provide a comprehensive explanation, 2) has a systematic grasp of all of the information, and 3) can defend her explanation against any questions or criticisms.

First, in order to understand something I must be able to provide a comprehensive explanation of it. A physicist, for example, who understands the theory of relativity must be able to provide an explanation about why the theory of relativity is as it is, how it works, how it affects a variety of other physical laws and observations, and so forth.

Second, to understand something, one must be able to `see' the relationship between different bits of information in the whole of the field to which the bit of information belongs. You must have a systematic grasp of the information relating to the matter at hand, such that you see that the information, and the relationships that the different bits of information have with one another, forms an almost organic whole. Thus, a car mechanic who understands why a part of the car is making the sound that it is has this understanding because he has a systematic grasp of the whole of the vehicle. He knows how the different parts relate to each other and how and in what ways certain conditions will affect both the different parts of the vehicle and the vehicle as a whole. This ties closely into the need for a comprehensive explanation. The physicist (or car mechanic) is able to provide a comprehensive explanation of the thing that she understands because she understands and `sees' the thing as a whole, as part of a complete system.

Finally, in order to understand something, one must be able to defend her claim against any criticisms that are leveled against it. This defense must itself be explanatory. One cannot defend her view by pointing to the words of another, but must defend it by demonstrating an ability to look at the issue in a variety of ways and as part of a systematic whole. She is not proving her certainty with regard to an issue, but is demonstrating her understanding of the issue. In defending her view successfully, she demonstrates a reliability and stability within her account.

Apparent in this account of understanding (I hope!) is that one must have a rather large web of information about the matter which one claims to understand. In order to develop and defend a suitably comprehensive explanation, one must be able to employ a huge number (and variety) of facts and bits of knowledge that relate to the thing she claims to understand. And, as the systematicity requirement shows, that web of information must be structured in a systematic manner.

Not Every Truth Can Be Known:
at least, not all at once

According to the knowability thesis, every truth is knowable. Fitch's paradox refutes the knowability thesis by showing that if we are not omniscient, then not only are some truths not known, but there are some truths that are not knowable. In this paper, I propose a weakening of the knowability thesis (which I call the "conjunctive knowability thesis") to the effect that for every truth p there is a collection of truths such that
(i) each of them is knowable and
(ii) their conjunction is equivalent to p.

I show that the conjunctive knowability thesis avoids triviality arguments against it, and that it fares very differently depending on one other issue connecting knowledge and possibility. If some things are knowable but false, then the conjunctive knowability thesis is trivially true. On the other hand, if knowability entails truth, the conjunctive knowability thesis is coherent, but only if the logic of possibility is quite weak.
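Fitch's argument, which the weakened thesis is designed to escape, runs as follows in its standard reconstruction (assuming only that knowledge K is factive and distributes over conjunction):

```latex
% Standard reconstruction of Fitch's knowability paradox
\begin{align*}
&\text{(KT)}\quad \forall p\,(p \to \Diamond K p)
    && \text{knowability thesis}\\
&\text{Suppose } q \land \lnot Kq \text{ is true.}
    && \text{some truth is unknown}\\
&\text{By (KT):}\quad \Diamond K(q \land \lnot Kq).\\
&\text{But}\quad K(q \land \lnot Kq) \vdash Kq \land K\lnot Kq
    && K \text{ distributes over } \land\\
&\phantom{\text{But}\quad K(q \land \lnot Kq)} \vdash Kq \land \lnot Kq
    && K \text{ is factive}\\
&\text{So } K(q \land \lnot Kq) \text{ is impossible: } \lnot\Diamond K(q \land \lnot Kq).\\
&\text{Contradiction; hence } \lnot(q \land \lnot Kq), \text{ i.e. } q \to Kq.
\end{align*}
```

Since q was arbitrary, the knowability thesis collapses "every truth is knowable" into "every truth is known"; the conjunctive weakening demands knowability only for the members of a collection whose conjunction is equivalent to q, which blocks the step applying (KT) to the unknowable conjunction itself.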

Some Thoughts About the Relationship Between Information and Understanding

Michael O. Luke
Paper to be presented at the American Society for Information Science Conference, San Diego, CA, May 20-22, 1996

That there is a relationship between information and understanding seems intuitively obvious. If we try to express this relationship mathematically, however, it soon becomes clear that the relationship is complex and mysterious. Knowing more about the connection, however, is important, not least because we need more understanding as our world becomes faster paced and increasingly complex. The influence of increasing the amount of information, of increasing the effectiveness of information mining tools, and of organizing information to aid the cognitive process is briefly discussed.

Posted by Tony Marmo at 00:01 BST
Updated: Monday, 9 August 2004 08:20 BST
Monday, 2 August 2004


Constructions and Formal Semantics

by Marc Moffett
Source: Close Range June 27, 2004

I have been arguing, for instance in my dissertation, that the correctness of Construction Grammar is pretty much uncontroversial. The point, basically, is that no one has ever proposed a semantic theory for even a simple language that doesn't assume the existence of at least one linguistic construction, usually the subject-predicate construction. So in my view those guys over in Berkeley (and their followers) are on pretty solid ground. (The only way I can see to avoid this conclusion is to argue that predication, or function-application, isn't part of the semantics, but something extra.)

Unfortunately, in virtue of not taking explicit account of the role of constructions in their philosophical semantics, philosophers of language (and linguistic semanticists) have been led to, in my estimation, very implausible linguistic theses. My personal bugbear is the doctrine of logical forms, construed as a linguistic thesis. I want to be clear here that, although I am not convinced of the need for a level LF in syntax, that notion of logical form is far too weak to do the sort of work required by the sorts of robust semantic analyses posited these days. (Think, for instance, of the neo-Davidsonian analysis of eventive sentences!) In order to accommodate these robust semantic analyses, the underlying logical forms would have to be vastly more complex than can reasonably be motivated on purely syntactic grounds.

So why have so many philosophers been suckered into accepting them? I'm not sure, but I wonder if it doesn't arise in part from an implicit acceptance of the Fregean view of the language-proposition relation. According to Frege (or, at least, Dummett's Frege), our only cognitive access to propositions is via the linguistic structure of the sentences that express them. If Frege's Thesis is correct, then the need for a robust semantics will require a correspondingly complex underlying linguistic structure.

[It is also worth considering, in this quasi-historical context, whether or not Russell's notion of contextual definition and the associated doctrine of "incomplete symbols" doesn't mark out an inchoate construction-based theory of language.]


Jason Stanley

Your question should be -- why have so many *linguists* been suckered into accepting logical form, with rich covert syntactic structures? Once the point is put in this more adequate manner, it becomes clear you're being more than a little dogmatic.
Those philosophers who do accept rich logical forms do so because, in taking syntax classes for many years, we've been introduced to the notion of a rich logical form with lots of covert structure (is Richard Larson in a philosophy department? Is Chomsky in a philosophy department? Pesetsky?). Robert May's book on logical form in the 1980s had a big impact on syntax and semantics, and many of us who started doing linguistics then were doing GB, and read that book. Minimalist syntax makes different assumptions than GB, and seeks to explain different evidence. But, if anything, it postulates much more covert structure.
In my experience, it's *philosophers* who are reluctant to buy linguistic arguments for covert structures.

Part of the problem has to do with what's meant by "purely syntactic grounds". If what you mean is, on the basis of judgements of grammaticality and ungrammaticality alone, then that is simply an overly simplistic conception of "purely syntactic grounds". For example, we distinguish bound vs. free readings of pronouns not on the grounds of grammaticality, but on the grounds that they give rise to different readings. We appeal to different potential attachment sites of modifiers as arguments for underlying constituent structures. And so on -- so your post assumes some conception of "purely syntactic grounds" that is overly philosophical in nature.

Posted by Tony Marmo at 17:15 BST
Updated: Monday, 9 August 2004 07:38 BST
Wednesday, 28 July 2004




Varieties of Meaning
The 2002 Jean Nicod Lectures
By Ruth Garrett Millikan

Many different things are said to have meaning: people mean to do various things; tools and other artifacts are meant for various things; people mean various things by using words and sentences; natural signs mean things; representations in people's minds also presumably mean things. In Varieties of Meaning, Ruth Garrett Millikan argues that these different kinds of meaning can be understood only in relation to each other.

What does meaning in the sense of purpose (when something is said to be meant for something) have to do with meaning in the sense of representing or signifying? Millikan argues that explicit human purposes, explicit human intentions, are represented purposes. They do not merely represent purposes; they possess the purposes that they represent. She argues further that things that signify, intentional signs such as sentences, are distinguished from natural signs by having purpose essentially; therefore, unlike natural signs, intentional signs can misrepresent or be false.

Part I discusses "Purposes and Cross-Purposes" -- what purposes are, the purposes of people, of their behaviors, of their body parts, of their artifacts, and of the signs they use. Part II then describes a previously unrecognized kind of natural sign, "locally recurrent" natural signs, and several varieties of intentional signs, and discusses the ways in which representations themselves are represented. Part III offers a novel interpretation of the way language is understood and of the relation between semantics and pragmatics. Part IV discusses perception and thought, exploring stages in the development of inner representations, from the simplest organisms whose behavior is governed by perception-action cycles to the perceptions and intentional attitudes of humans.
Time, Tense, and Reference
Edited by Aleksandar Jokic and Quentin Smith

Among the many branches of philosophy, the philosophy of time and the philosophy of language are more intimately interconnected than most, yet their practitioners have long pursued independent paths. This book helps to bridge the gap between the two groups. As it makes clear, it is increasingly difficult to do philosophy of language without any metaphysical commitments as to the nature of time, and it is equally difficult to resolve the metaphysical question of whether time is tensed or tenseless independently of the philosophy of language. Indeed, one is tempted to see philosophy of language and metaphysics as a continuum with no sharp boundary.

The essays, which were written expressly for this book by leading philosophers of language and philosophers of time, discuss the philosophy of language and its implications for the philosophy of time and vice versa. The intention is not only to further dialogue between philosophers of language and of time but also to present new theories to advance the state of knowledge in the two fields. The essays are organized in two sections -- one on the philosophy of tensed language, the other on the metaphysics of time.




The Syntax of Time
Edited by Jacqueline Gueron and Jacqueline Lecarme

Any analysis of the syntax of time is based on a paradox: it must include a syntax-based theory of both tense construal and event construal. Yet while time is one-dimensional, events have a complex spatiotemporal structure that reflects their human participants. How can an event be flattened to fit into the linear time axis?
Chomsky's The Minimalist Program, published in 1995, offers a way to address this problem. The studies collected in The Syntax of Time investigate whether problems concerning the construal of tense and aspect can be reduced to syntactic problems for which the basic mechanism and principles of generative grammar already provide solutions.

These studies, recent work by leading international scholars in the field, offer varied perspectives on the syntax of tense and the temporal construal of events: models of tense interpretation, construal of verbal forms, temporal aspect versus lexical aspect, the relation between the event and its argument structure, and the interaction of case with aktionsart or tense construal. Advances in the theory of temporal interpretation in the sentence are also applied to the temporal interpretation of nominals.


Posted by Tony Marmo at 13:01 BST
Updated: Monday, 9 August 2004 07:44 BST
