LINGUISTIX&LOGIK, Tony Marmo's blog
Monday, 9 August 2004
HUMANS ARE BORN LOGICAL
Topic: Cognition & Epistemology

Isn't Tony multi-present? Why?


Children are interesting from both the epistemological and the logical-philosophical points of view, because they show how the mind works without the large stock of preconceived notions that a human gains as he grows older. For instance, I remember that years ago I liked to apply the following test to children who were just beginning to talk:

I asked them to check whether or not I was in some other place that I designated. They often went to the place I indicated and called me several times. As they got no answer from me there, they came back and told me that I was not there.

I suspect that this is evidence that children have a naturally and highly logical way of thinking. In this case, since they had no a priori reason to assume that I am not a multi-present being, they did not rule out that I could be at two different places at the same time.

Posted by Tony Marmo at 01:01 BST
Updated: Monday, 9 August 2004 08:28 BST

Topic: PARACONSISTENCY

Metaphilosophical Pluralism and Paraconsistency:


From Orientative to Multi-level Pluralism


by Orellana Benado, Andrés Bobenrieth, Carlos Verdugo

In a famous passage, Kant claimed that controversy and the lack of agreement in metaphysics--here understood as philosophy as a whole--was a `scandal.' Attempting to motivate his critique of pure reason, a project aimed at both ending the scandal and setting philosophy on the `secure path of science,' Kant endorsed the view that for as long as disagreement reigned sovereign in philosophy, there would be little to be learned from it as a science. The success of philosophy begins when controversy ends and culminates when the discipline itself as it has been known disappears. On the other hand, particularly in the second half of the twentieth century, many have despaired of the very possibility of philosophy constituting the search for truth, that is to say, a cognitive human activity, and constituting thus a source of knowledge. This paper seeks to sketch a research program that is motivated by an intuition that opposes both of these views.

Link

Posted by Tony Marmo at 01:01 BST
Updated: Monday, 9 August 2004 08:22 BST
UTTERANCE AND THOUGHT CONTEXTS
Topic: HUMAN SEMANTICS

Context of Thought and Context of Utterance

(A Note on Free Indirect Discourse and the Historical Present)


by Philippe Schlenker

Abstract

Based on the analysis of narrations in Free Indirect Discourse and the Historical Present, we argue (building in particular on Banfield 1982 and Doron 1991) that the grammatical notion of context of speech should be ramified into a Context of Thought and a Context of Utterance. Tense and person depend on the Context of Utterance, while all other indexicals (including here, now and the demonstratives) are evaluated with respect to the Context of Thought. Free Indirect Discourse and the Historical Present are analyzed as special combinatorial possibilities that arise when the two contexts are distinct, and exactly one of them is presented as identical to the physical point at which the sentence is articulated.



link

Posted by Tony Marmo at 00:01 BST
Updated: Monday, 9 August 2004 08:06 BST
Sunday, 8 August 2004

Topic: Cognition & Epistemology

TO QUESTION PREMISES OR NOT TO QUESTION THEM



I often like to play the following game in telling stories to people (either grown ups or children):

First-- I pick the title of a well-known tale and start telling another story with characters from a third source. E.g.:

Puss in Boots

Once upon a time there lived three young sisters: Snow White, Goldilocks and Red Riding Hood. Their father was a very good woodchopper who had married an evil woman. Their stepmother made their lives miserable and forced them to do all the chores, while she kept practising her witchcraft. One day she put a spell on the woodchopper and made him go to the market with his young daughters in order to sell them. 'We shall need the money to buy our victuals,' she said. Then the mesmerised man went to the market with his daughters...


Second-- In the middle of the story, I ask the hearers some unexpected question, like:

Who will the three girls meet on the road before they get to the market? The Wolf or the Charming Prince?


People often got confused by this kind of game and made all sorts of guesses. It took a long time before they realised that there could be no right answer according to common lore, because the story had been twisted from the beginning.

I have seen this kind of problem in scientific discussions too. People usually try to discuss the implications and the empirical testing methods employed to confirm or discard assumptions that were absurd from the start.

Indeed, people do not like to question premises, but are eager to have heated arguments on the consequences. Why?

Posted by Tony Marmo at 01:01 BST
Updated: Monday, 9 August 2004 07:41 BST
SYNTAX AND SEMANTICS
Topic: HUMAN SEMANTICS

On King's Syntactic Evidence for Semantic Theory


by Brian Weatherson
Source: Thoughts Arguments and Rants 2/14/2003


I finally got around to reading Jeff King's paper on syntactic evidence for semantic theories, and I was struck by one of the examples he uses. At first I thought what he said was obviously wrong, but on reflection I think it might not be so obvious. (Most of the paper seems right to me, at least on a first reading through, but I didn't have anything detailed to say about those parts. Well, except to note that debates in philosophy of language are getting pointed nowadays.)

Anyway, here was the point that I think I disagree with. Jeff wants to argue that syntactic data can be sometimes used to support semantic theories. One example of this (not the only one) involves negative polarity items (NPIs). Examples of NPIs are ever and any when it is meant as an existential quantifier. It seems these words can only appear inside the scope of a negation, or in a context that behaves in some ways as if it were inside the scope of a negation.

Simplifying the argument a little bit, Jeff seems to suggest that the following argument could be used to provide support for its conclusion.

(1) NPIs are licenced in the antecedents of conditionals

(2) NPIs are only licenced in downwards entailing contexts

(3) The antecedent of a conditional is a downwards entailing context

A `downwards entailing context' is (roughly) one where replacing a more general term with a more specific term produces a logically weaker sentence. So while (3a) does not entail (3b), showing that ordinary contexts are not downwards entailing, (3c) does entail (3d), showing that negated contexts are downwards entailing.

(3a) I will be given a birthday cake tomorrow.

(3b) I will be given a poisonous birthday cake tomorrow.

(3c) I will not be given a birthday cake tomorrow.

(3d) I will not be given a poisonous birthday cake tomorrow.

(I assume here that poisonous birthday cakes are still birthday cakes. I do hope that's true, or all my examples here will be no good.)
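
To make the notion fully explicit, here is one standard way to state downward entailingness, in LaTeX notation (a sketch; exact formulations vary across the literature):

\[
\mathrm{DE}(C) \;\iff\; \forall P\,\forall Q\;\Big[\big(\forall x\,(P(x)\to Q(x))\big)\;\Longrightarrow\;\big(C[Q]\models C[P]\big)\Big]
\]

Taking P = `poisonous birthday cake', Q = `birthday cake', and C the negated context in (3c)/(3d) shows that negation satisfies the clause; premise (2) together with the licensing fact (1) then commits one to the claim that the antecedent position of a conditional satisfies it too, which is exactly what yields the surprising entailment from (4) to (5) below.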

(2) was first proposed (to the best of my knowledge) in William Ladusaw's dissertation in I think 1979, and it has been revised a little since then, but many people I think hold that it is something like the right theory of when NPIs are licenced. But it does have one striking consequence: it implies (3). To give you a sense of how surprising (3) is, note that it implies that (4) entails (5).

(4) If I am given a birthday cake tomorrow, I will be happy.

(5) If I am given a poisonous birthday cake tomorrow, I will be happy.

Now, many people think that (4) could be true while (5) is false. It is certainly the case that there are contexts in which one could say (4) and not say (5). Perhaps the best explanation for that is pragmatic. Those who think that indicative conditionals are either material or strict implications will hold that it is pragmatic. But perhaps it is semantic. Officially, I think it is semantic, though I think the other side of the case has merit.

Here's where I think I disagree with Jeff. Imagine I am undecided about whether (4) really does entail (5). I think that the argument (1), (2) therefore (3) has no force whatsoever towards pushing me to think that it does. Rather, I think that only evidence to do with conditionals can tell in favour of the entailment of (5) by (4), and if that evidence is not sufficient to support the entailment claim, all the worse for premise (2).

At least, that was what I thought at first. On second thoughts, I think maybe I was a little too dogmatic here. On further review, though, I think my dogmatism was in the right direction. To test this, try a little thought experiment.

Imagine you think that all the evidence, except the evidence about conditionals, supports (2), or some slightly tidied up version of it. (This is not too hard to imagine, I think; (2) does remarkably well at capturing most of the data.) And imagine that, while there are pragmatic explanations of the apparent counter-examples to If Fa then p entails If Fa and Ga then p, you think those explanations are fairly weak. (Again, not too hard to imagine.) Does the inductive evidence in favour of (2), which we acknowledge is substantial, and the obvious truth of (1) give you reason to take those pragmatic explanations more seriously, and more generally more reason to believe that If Fa then p does entail If Fa and Ga then p? I still think no, but I can see why this might look like dogmatism to some.

I sometimes play at being a semanticist, but at heart I'm always a philosopher. And one of the occupational hazards of being a philosopher is that one takes methodological questions much more seriously than perhaps one ought. So at some level I care more about the methodological question raised in the last paragraph than I care about the facts about conditionals and NPIs. At that level, I'm rather grateful to Jeff for raising this question, because it's one of the harder methodological questions I think I've seen for a while.


Posted by Tony Marmo at 00:01 BST
Updated: Monday, 9 August 2004 07:57 BST
SYNTAX AND SEMANTICS
Topic: HUMAN SEMANTICS
Kai von Fintel's reply to Weatherson's comments:

Your thoughts here are quite on target. One can take distributional/syntactic facts (NPI-licensing in conditional antecedents) as an argument for a semantic analysis (monotonic semantics for conditionals with additional epicycles). But one can also take semantic evidence (apparent entailment patterns) as an argument against a particular analysis of the distribution patterns (against the Fauconnier-Ladusaw theory of NPIs for example). So, there is a tension here between syntax and semantics, which is precisely why it is necessary to always do both of them: you can't be a semanticist without knowing a whole lot about syntax, and vice versa. On top of that, it is inevitable that one needs to take pragmatics into account. In the end, this kind of inquiry is part of a complex science and there are a lot of moving parts.

The particular fact of NPI-licensing in conditional antecedents has been a major focus of my own work on conditionals, see my two papers:


Counterfactuals in a Dynamic Context (2001) in Michael Kenstowicz (ed.) Ken Hale: A Life in Language, MIT Press. pp. 123-152.

NPI Licensing, Strawson Entailment, and Context Dependency (1999) Journal of Semantics, 16(2), pp. 97-148.


Source: von Fintel's blog

Posted by Tony Marmo at 00:01 BST
Updated: Monday, 9 August 2004 07:56 BST

Topic: Cognition & Epistemology

SUMMARY OF THIS BLOG



Knowledge and Stability



by Joe Shieber
June 08, 2004


Marc Moffett has been considering some interesting questions concerning knowledge and stable belief and justification at Close Range. In response to some probing questions, he submitted a follow-up post, including the following example:

The other day I was going out of town and was supposed to call some friends when I got into the airport. My wife wrote their number down and I glanced over it. As I was leaving, she reminded me to take the number. I said, 'I know it' and proceeded to recite it from memory. Knowing that the number was still fresh in my mind, her response was, 'Do you really know it?'


Marc suggests that the example shows that knowledge sometimes requires not simply reliably-produced true belief (let's grant that the short-term memorial faculty allowing Marc to rattle off the number correctly is reliable), but stable belief, or stably justified belief. Marc claims that we have an intuitive grasp of stability and instability to which he can appeal in making this suggestion. However, and without meaning to be difficult, I still don't know what stability is; nevertheless, let's leave this problem aside.

What I want to do here is suggest an alternate diagnosis for Marc's example. Continue


Knowledge Discourses and Interaction Technology



by Carsten Sørensen & Masao Kakihara

Research within knowledge management tends to either overemphasize or underestimate the role of Information and Communication Technology (ICT). Furthermore, much of the ICT support debate has been shaped by the data-information-knowledge trichotomy and too focused on repository-based approaches.
We wish to engage in a principled debate concerning the character and role of knowledge technologies in contemporary organizational settings. The aim of this paper is to apply four perspectives on the management of knowledge to highlight four perspectives on technological options. The paper presents, based on four knowledge discourses -- four interrelated perspectives on the management of knowledge -- four perspectives on ICT support for the management of knowledge, each reviewing relevant literature and revealing a facet of how we can conceptualize the role of technology for knowledge management.
The four technology discourses focus on the: Production and distribution of information; interpretation and navigation of information; codification and embedding of collaboration; and establishment and maintenance of connections. Continue


The Relationship Between Knowledge and Understanding



by Michelle Jenkins

I've been thinking a lot lately about the relationship between knowledge and understanding. Knowledge and understanding, I think, are quite different sorts of things. My general grasp of the nature of understanding is influenced largely by the Ancients. One understands something if she 1) is able to provide a comprehensive explanation 2) has a systematic grasp of all of the information and 3) can defend her explanation against any questions or criticisms.

First, in order to understand something I must be able to provide a comprehensive explanation of it. A physicist, for example, who understands the theory of relativity must be able to provide an explanation about why the theory of relativity is as it is, how it works, how it affects a variety of other physical laws and observations, and so forth.

Second, to understand something, one must be able to `see' the relationship between different bits of information in the whole of the field to which the bit of information belongs. You must have a systematic grasp of the information relating to the matter at hand such that you see that the information, and the relationships that the different bits of information have with one another, forms an almost organic whole. Thus, a car mechanic who understands why the part of the car is making the sound that it is, has this understanding because he has a systematic grasp of the whole of the vehicle. He knows how the different parts relate to each other and how and in what ways certain conditions will affect both the different parts of the vehicle and the vehicle as a whole. This ties closely into the need for a comprehensive explanation. The physicist (or car mechanic) is able to provide a comprehensive explanation about the thing that she understands because she understands and `sees' the thing as a whole, as part of a complete system.

Finally, in order to understand something, one must be able to defend her claim against any criticisms that are leveled against it. This defense must itself be explanatory. One cannot defend her view by pointing to the words of another, but must defend it by demonstrating an ability to look at the issue in a variety of ways and as part of a systematic whole. She is not proving her certainty with regard to an issue, but is demonstrating her understanding of the issue. In defending her view successfully, she demonstrates a reliability and stability within her account.

Apparent in this account of understanding (I hope!) is that one must have a rather large web of information about the matter which one claims to understand. In order to develop and defend a suitably comprehensive explanation, one must be able to employ a huge number (and variety) of facts and bits of knowledge that relate to the thing she claims to understand. And, as the systematicity requirement shows, that web of information must be structured in a systematic manner. Continue

Not Every Truth Can Be Known:
at least, not all at once




According to the knowability thesis, every truth is knowable. Fitch's paradox refutes the knowability thesis by showing that if we are not omniscient, then not only are some truths not known, but there are some truths that are not knowable. In this paper, I propose a weakening of the knowability thesis (which I call the "conjunctive knowability thesis") to the effect that for every truth p there is a collection of truths such that
(i) each of them is knowable and
(ii) their conjunction is equivalent to p.

I show that the conjunctive knowability thesis avoids triviality arguments against it, and that it fares very differently depending on one other issue connecting knowledge and possibility. If some things are knowable but false, then the conjunctive knowability thesis is trivially true. On the other hand, if knowability entails truth, the conjunctive knowability thesis is coherent, but only if the logic of possibility is quite weak. Continue
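
For readers who want the theses in symbols, here is a minimal sketch (my own rendering, not necessarily the paper's notation), with K for `it is known that' and the diamond for possibility, assuming as usual that K is factive and distributes over conjunction:

\[
\text{Knowability thesis:}\quad \forall p\,\big(p \rightarrow \Diamond K p\big)
\]
\[
\text{Fitch:}\quad \vdash\; \neg\Diamond K\big(p \wedge \neg K p\big)\,, \quad\text{so if some truth is unknown, some truth is unknowable.}
\]
\[
\text{Conjunctive knowability:}\quad \forall p\,\Big(p \rightarrow \exists q_1,\ldots,q_n\;\big[\textstyle\bigwedge_i \Diamond K q_i \;\wedge\; \big(p \leftrightarrow \textstyle\bigwedge_i q_i\big)\big]\Big)
\]

The weakening is that each conjunct, rather than the whole truth p, is what must be knowable.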


Some Thoughts About the Relationship Between Information and Understanding



Michael O. Luke
Paper to be presented at the American Society for Information Science Conference, San Diego, CA, May 20-22, 1996

That there is a relationship between information and understanding seems intuitively obvious. If we try to express this relationship mathematically, however, it soon becomes clear that the relationship is complex and mysterious. Knowing more about the connection, however, is important, not least because we need more understanding as our world becomes faster paced and increasingly complex. The influence of increasing the amount of information, of increasing the effectiveness of information mining tools, and of ways of organizing information to aid the cognitive process is briefly discussed. Continue

Posted by Tony Marmo at 00:01 BST
Updated: Monday, 9 August 2004 08:20 BST
Monday, 2 August 2004

Topic: HUMAN SEMANTICS

Constructions and Formal Semantics


by Marc Moffett
Source: Close Range June 27, 2004


I have been arguing, for instance in my dissertation, that the correctness of Construction Grammar is pretty much uncontroversial. The point, basically, is that no one has ever proposed a semantic theory for even a simple language that doesn't assume the existence of at least one linguistic construction, usually the subject-predicate construction. So in my view those guys over in Berkeley (and their followers) are on pretty solid ground. (The only way I can see to avoid this conclusion is to argue that predication, or function-application, isn't part of the semantics, but something extra.)

Unfortunately, in virtue of not taking explicit account of the role of constructions in their philosophical semantics, philosophers of language (and linguistic semanticists) have been led to, in my estimation, very implausible linguistic theses. My personal bugbear is the doctrine of logical forms, construed as a linguistic thesis. I want to be clear here that, although I am not convinced of the need for a level LF in syntax, that notion of logical form is far too weak to do the sort of work required by the sorts of robust semantic analyses posited these days. (Think, for instance, of the neoDavidsonian analysis of eventive sentences!) In order to accommodate these robust semantic analyses, the underlying logical forms would have to be vastly more complex than can be reasonably motivated on purely syntactic grounds.

So why have so many philosophers been suckered into accepting them? I'm not sure, but I wonder if it doesn't arise in part from an implicit acceptance of the Fregean view of the language-proposition relation. According to Frege (or, at least, Dummett's Frege), our only cognitive access onto propositions is via the linguistic structure of the sentences that express them. If Frege's Thesis is correct, then the need for a robust semantics will require a correspondingly complex underlying linguistic structure.

[It is also worth considering, in this quasi-historical context, whether or not Russell's notion of contextual definition and the associated doctrine of "incomplete symbols" doesn't mark out an inchoate construction-based theory of language.]


Comments


Jason Stanley

Your question should be -- why have so many *linguists* been suckered into accepting logical form, with rich covert syntactic structures? Once the point is put in this more adequate manner, it becomes clear you're being more than a little dogmatic.
Those philosophers who do accept rich logical forms do so because, in taking syntax classes for many years, we've been introduced to the notion of a rich logical form with lots of covert structure (is Richard Larson in a philosophy department? Is Chomsky in a philosophy department? Pesetsky?). Robert May's book on logical form in the 1980s had a big impact on syntax and semantics, and many of us who started doing linguistics then were doing GB, and read that book. Minimalist syntax makes different assumptions than GB, and seeks to explain different evidence. But, if anything, it postulates much more covert structure.
In my experience, it's *philosophers* who are reluctant to buy linguistic arguments for covert structures.

Part of the problem has to do with what's meant by "purely syntactic grounds". If what you mean is, on the basis of judgements of grammaticality and ungrammaticality alone, then that is simply an oversimplistic conception of "purely syntactic grounds". For example, we distinguish bound vs. free readings of pronouns not on the grounds of grammaticality, but on the grounds that they give rise to different readings. We appeal to different potential attachment sites of modifiers as arguments for underlying constituent structures. And so on -- so your post assumes some conception of "purely syntactic grounds" that is overly philosophical in nature.

Posted by Tony Marmo at 17:15 BST
Updated: Monday, 9 August 2004 07:38 BST
Wednesday, 28 July 2004
TIME-SPACE AND SYNTAX-SEMANTICS
Topic: HUMAN SEMANTICS

RECOMMENDED BOOKS:



BOOK #1

TIME, TENSE AND REFERENCE


Edited by Aleksandar Jokic & Quentin Smith


Among the many branches of philosophy, the philosophy of time and the philosophy of language are more intimately interconnected than most, yet their practitioners have long pursued independent paths. This book helps to bridge the gap between the two groups. As it makes clear, it is increasingly difficult to do philosophy of language without any metaphysical commitments as to the nature of time, and it is equally difficult to resolve the metaphysical question of whether time is tensed or tenseless independently of the philosophy of language. Indeed, one is tempted to see philosophy of language and metaphysics as a continuum with no sharp boundary.

The essays, which were written expressly for this book by leading philosophers of language and philosophers of time, discuss the philosophy of language and its implications for the philosophy of time and vice versa. The intention is not only to further dialogue between philosophers of language and of time but also to present new theories to advance the state of knowledge in the two fields. The essays are organized in two sections -- one on the philosophy of tensed language, the other on the metaphysics of time.


VARIETIES OF MEANING

The 2002 Jean Nicod Lectures
By Ruth Garrett Millikan


Many different things are said to have meaning: people mean to do various things; tools and other artifacts are meant for various things; people mean various things by using words and sentences; natural signs mean things; representations in people's minds also presumably mean things. In Varieties of Meaning, Ruth Garrett Millikan argues that these different kinds of meaning can be understood only in relation to each other.

What does meaning in the sense of purpose (when something is said to be meant for something) have to do with meaning in the sense of representing or signifying? Millikan argues that the explicit human purposes, explicit human intentions, are represented purposes. They do not merely represent purposes; they possess the purposes that they represent. She argues further that things that signify, intentional signs such as sentences, are distinguished from natural signs by having purpose essentially; therefore, unlike natural signs, intentional signs can misrepresent or be false.

Part I discusses "Purposes and Cross-Purposes" -- what purposes are, the purposes of people, of their behaviors, of their body parts, of their artifacts, and of the signs they use. Part II then describes a previously unrecognized kind of natural sign, "locally recurrent" natural signs, and several varieties of intentional signs, and discusses the ways in which representations themselves are represented. Part III offers a novel interpretation of the way language is understood and of the relation between semantics and pragmatics. Part IV discusses perception and thought, exploring stages in the development of inner representations, from the simplest organisms whose behavior is governed by perception-action cycles to the perceptions and intentional attitudes of humans.


link

BOOK #2

THE SYNTAX OF TIME


Edited by Jacqueline Gueron and Jacqueline Lecarme


Any analysis of the syntax of time is based on a paradox: it must include a syntax-based theory of both tense construal and event construal. Yet while time is one-dimensional, events have a complex spatiotemporal structure that reflects their human participants. How can an event be flattened to fit into the linear time axis?
Chomsky's The Minimalist Program, published in 1995, offers a way to address this problem. The studies collected in The Syntax of Time investigate whether problems concerning the construal of tense and aspect can be reduced to syntactic problems for which the basic mechanism and principles of generative grammar already provide solutions.

These studies, recent work by leading international scholars in the field, offer varied perspectives on the syntax of tense and the temporal construal of events: models of tense interpretation, construal of verbal forms, temporal aspect versus lexical aspect, the relation between the event and its argument structure, and the interaction of case with aktionsart or tense construal. Advances in the theory of temporal interpretation in the sentence are also applied to the temporal interpretation of nominals.


link

Posted by Tony Marmo at 13:01 BST
Updated: Monday, 9 August 2004 07:44 BST

Topic: Cognition & Epistemology

Contrastivism and Hawthorne's principle of practical reasoning


by Jon Kvanvig
Source: Certain Doubts 7/26/2004


Contrastivism holds that the truth makers for knowledge attributions always involve a contrast, and Hawthorne thinks that if you know something, you are entitled to use it in practical reasoning. So one way to test what is known is to see what kinds of practical reasoning we'll allow are acceptable.

Depending on what the contrast is, contrastive knowledge may be easy or hard to have. So, it is easier to know 'the train will be on time rather than a day late' than it is to know 'the train will be on time rather than 2 minutes late.' One way to put the difference is that one is presupposing more in knowing the first claim than one is in knowing the second.

Consider then a piece of practical reasoning using the following conditional: if you are pointing a gun at me, and if your gun is loaded and if you intend to shoot me, I should shoot you first. Suppose I know that you are pointing a gun at me rather than a twig, and that I know that your gun is loaded rather than having just been disassembled for cleaning, and suppose I know that you intend to shoot me rather than give me a million bucks. Should I shoot you? Maybe this is an anti-gun sentiment coming out, but I think it is far from obvious that I should.

Compare this case to another. In this case, I know that you are pointing a gun at me rather than any non-lethal item, and I know that your gun is loaded rather than merely having the appearance of being loaded from where I stand, and I know that you intend to shoot me rather than anyone else in the universe. Now I think I should shoot you first.

Why the difference? In contrastivist language, I'm presupposing too much in the first case, I think. The knowledge I have is easy knowledge because it presupposes so much. I'm presupposing that the thing in your hand is either a gun or a twig, that it's either loaded or disassembled, that you intend to shoot me or make me rich. In the context of these assumptions, it's too easy to come to the conclusion that I should shoot first and ask questions later. In the second case, however, my presuppositions are much broader, broad enough that my knowledge is no longer easy. And since my knowledge is not easy, I doubt I could be faulted on grounds of rationality for shooting first.

It appears, then, that contrastivists will have to deny Hawthorne's principle. Moreover, I don't see any obvious way to qualify the principle for the following reason. If the action is relatively inconsequential, then easy knowledge may be enough to warrant performing the action. But if the action is immensely significant, as it is in the case of taking a life, then easy knowledge doesn't seem to be enough.

One way to think about such cases is that they may provide a reason for including pragmatic issues in one's account, either indirectly as contextualists typically do or directly as we find in the invariantist camp. Or maybe a reason for rejecting Hawthorne's principle?
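
Put schematically (my own notation, not Kvanvig's), the two commitments in play are roughly:

\[
\text{Hawthorne's principle:}\quad K_s\,p \;\rightarrow\; s \text{ may use } p \text{ as a premise in practical reasoning}
\]
\[
\text{Contrastivism:}\quad \text{knowledge attributions have the form } K_s(p/q)\quad (\text{`}s\text{ knows that }p\text{ rather than }q\text{'})
\]

The gun cases then say that K_s(p/q) for an undemanding contrast q (gun rather than twig) does not license the action, while something closer to K_s(p/\neg p) does; with no single proposition `s knows p' to plug into the principle, the contrastivist seems forced either to relativize the principle or to give it up.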

Posted by Tony Marmo at 12:28 BST
Updated: Monday, 9 August 2004 07:45 BST

Topic: SCIENCE & NEWS

John Passmore


1914–2004
Source: Philosophy Program of the
Research School of Social Sciences
of the Australian National University


John Passmore was Reader in Philosophy at the Research School of Social Sciences, ANU, from 1955 through until 1957, and Professor of Philosophy from 1958 until his retirement in 1979. He was Head of the Philosophy Program from 1962 until 1976. Passmore's book A Hundred Years of Philosophy was recognised as a major feat of philosophical scholarship throughout the international philosophical community.

It was followed by influential books on a whole range of issues, including Man's Responsibility for Nature, one of the first books on the philosophical issues raised by the environmental movement. Passmore was one of the very first to give shape to what is now, under his influence, called 'applied philosophy.' His many books have been translated into a wide variety of languages. He remains a major figure in the history of ideas. In recognition of his service to education, Passmore was made a Companion in the General Division of the Order of Australia in 1992. The first volume of his autobiography, Memoirs of a Semi-detached Australian, was published by Melbourne University Press in 1997.

He was Emeritus Professor of Philosophy and Visiting Fellow in the History Program at RSSS, and died in Canberra on Sunday.


Obituary


Posted by Tony Marmo at 05:48 BST
Updated: Monday, 9 August 2004 07:46 BST
Tuesday, 27 July 2004
A NEW VERSION OF RADICAL MENTALISM?
Topic: SCIENCE & NEWS

Be warned, this could be the matrix


Source: The Sydney Morning Herald, July 22, 2004

The multiverse theory has spawned another - that our universe is a simulation, writes Paul Davies.


If you've ever thought life was actually a dream, take comfort. Some pretty distinguished scientists may agree with you. Philosophers have long questioned whether there is in fact a real world out there, or whether "reality" is just a figment of our imagination.

Then along came the quantum physicists, who unveiled an Alice-in-Wonderland realm of atomic uncertainty, where particles can be waves and solid objects dissolve away into ghostly patterns of quantum energy.

Now cosmologists have got in on the act, suggesting that what we perceive as the universe might in fact be nothing more than a gigantic simulation.

The story behind this bizarre suggestion began with a vexatious question: why is the universe so bio-friendly? Cosmologists have long been perplexed by the fact that the laws of nature seem to be cunningly concocted to enable life to emerge. Take the element carbon, the vital stuff that is the basis of all life. It wasn't made in the big bang that gave birth to the universe. Instead, carbon has been cooked in the innards of giant stars, which then exploded and spewed soot around the universe.

The process that generates carbon is a delicate nuclear reaction. It turns out that the whole chain of events is a damned close run thing, to paraphrase Lord Wellington. If the force that holds atomic nuclei together were just a tiny bit stronger or a tiny bit weaker, the reaction wouldn't work properly and life may never have happened.

The late British astronomer Fred Hoyle was so struck by the coincidence that the nuclear force possessed just the right strength to make beings like Fred Hoyle that he proclaimed the universe to be "a put-up job". Since this sounds a bit too much like divine providence, cosmologists have been scrambling to find a scientific answer to the conundrum of cosmic bio-friendliness.

The one they have come up with is multiple universes, or "the multiverse". This theory says that what we have been calling "the universe" is nothing of the sort. Rather, it is an infinitesimal fragment of a much grander and more elaborate system in which our cosmic region, vast though it is, represents but a single bubble of space amid a countless number of other bubbles, or pocket universes.

Things get interesting when the multiverse theory is combined with ideas from sub-atomic particle physics. Evidence is mounting that what physicists took to be God-given unshakeable laws may be more like local by-laws, valid in our particular cosmic patch, but different in other pocket universes. Travel a trillion light years beyond the Andromeda galaxy, and you might find yourself in a universe where gravity is a bit stronger or electrons a bit heavier.

The vast majority of these other universes will not have the necessary fine-tuned coincidences needed for life to emerge; they are sterile and so go unseen. Only in Goldilocks universes like ours where things have fallen out just right, purely by accident, will sentient beings arise to be amazed at how ingeniously bio-friendly their universe is.

It's a pretty neat idea, and very popular with scientists. But it carries a bizarre implication. Because the total number of pocket universes is unlimited, there are bound to be at least some that are not only inhabited, but populated by advanced civilisations - technological communities with enough computer power to create artificial consciousness. Indeed, some computer scientists think our technology may be on the verge of achieving thinking machines.

It is but a small step from creating artificial minds in a machine, to simulating entire virtual worlds for the simulated beings to inhabit. This scenario has become familiar since it was popularised in The Matrix movies.

Now some scientists are suggesting it should be taken seriously. "We may be a simulation ... creations of some supreme, or super-being," muses Britain's astronomer royal, Sir Martin Rees, a staunch advocate of the multiverse theory. He wonders whether the entire physical universe might be an exercise in virtual reality, so that "we're in the matrix rather than the physics itself".

Is there any justification for believing this wacky idea? You bet, says Nick Bostrom, a philosopher at Oxford University, who even has a website devoted to the topic ( http://www.simulation-argument.com ). "Because their computers are so powerful, they could run a great many simulations," he writes in The Philosophical Quarterly .

So if there exist civilisations with cosmic simulating ability, then the fake universes they create would rapidly proliferate to outnumber the real ones. After all, virtual reality is a lot cheaper than the real thing. So by simple statistics, a random observer like you or me is most probably a simulated being in a fake world. And viewed from inside the matrix, we could never tell the difference.
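
The "simple statistics" can be made concrete with a toy calculation (illustrative numbers of my own, not Bostrom's): suppose each real civilisation that reaches this stage runs N ancestor-simulations, each containing about as many observers as the real history does. Then, reasoning as a random member of all observers, real or simulated:

\[
\Pr(\text{I am simulated}) \;\approx\; \frac{N}{N+1}\,, \qquad \text{e.g. } N = 1000 \;\Rightarrow\; \Pr \approx 0.999.
\]

This is only a headcount argument; it says nothing about whether the difference could be detected from the inside.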

Or could we? John Barrow, a colleague of Martin Rees at Cambridge University, wonders whether the simulators would go to the trouble and expense of making the virtual reality foolproof. Perhaps if we look closely enough we might catch the scenery wobbling.

He even suggests that a glitch in our simulated cosmic history may have already been discovered, by John Webb at the University of NSW. Webb has analysed the light from distant quasars, and found that something funny happened about 6 billion years ago - a minute shift in the speed of light. Could this be the simulators taking their eye off the ball?

I have to confess to being partly responsible for this mischief. Last year I wrote an item for The New York Times , saying that once the multiverse genie was let out of the bottle, Matrix -like scenarios inexorably follow. My conclusion was that perhaps we should retain a healthy scepticism for the multiverse concept until this was sorted out. But far from being a dampener on the theory, it only served to boost enthusiasm for it.

Where will it all end? Badly, perhaps. Now the simulators know we are on to them, and the game is up, they may lose interest and decide to hit the delete button. For your own sake, don't believe a word that I have written.


Paul Davies is professor of natural philosophy at Macquarie University's Australian Centre for Astrobiology. His latest book is How to Build a Time Machine.

Posted by Tony Marmo at 17:25 BST
Updated: Monday, 9 August 2004 07:55 BST
Monday, 26 July 2004
QUESTION ON ENGLISH
I have a question for all native English speakers around the world. Given the pair of sentences below:

(1) The postman is evil.
(2) The postman is like evil.


What are the meaning differences you grasp when you read these two sentences?

I thank you for your replies.

Posted by Tony Marmo at 17:40 BST
Updated: Monday, 26 July 2004 17:46 BST
ON COUNTERFACTUALS [2]
Topic: GENERAL LOGIC

Predictive Prophecy and Counterfactuals


by Jeremy Pierce

Source: Orange Philosophy June 25, 2004

In line with the discussions of time and time travel at my blog and to some degree here also, our own Gnu has a related puzzle using a fun fantasy role-playing kind of example for a philosophical puzzle about conditional predictive prophecy (i.e. predicting what someone will do and then telling him that A will have already happened if he ends up doing P but B will have already happened if he turns out to do Q). I think this case is interesting in terms of its view of time and of the relation of guaranteed prediction to time, but it also has some relevance to how to evaluate counterfactual statements. Read the case first at Gnu's blog, then read on here for my analysis.

The Liche Lord has predicted what Thurvan would do. That means he knew that Thurvan would go to all the rooms. Therefore, assuming he isn't lying, he hasn't placed the sword in the room he said it would be in if Thurvan had chosen not to go to the other rooms. Thurvan was correct to say that the sword is either there or not, but he was wrong to think that it was there independent of his decision. It was there because of what he would do. If Thurvan had chosen otherwise, and the Liche Lord had still set up the same deal, the sword would have been there. But unless he's lying, the sword can't be there as things stand. Given that the Liche Lord can take the shape of any object and enjoys taking people to be his undead slaves, you might expect that what the dwarf sees as the sword is probably the Liche Lord himself waiting to trap him.

Of course, the Liche Lord can see the future, so this is probably only the case if the Liche Lord has predicted that the dwarf will take the sword. He may well have predicted that the dwarf would reason through all this and leave without going for the sword, in which case he may have lied and put the sword there anyway. What's great about this is that the sword might really be there but only if he doesn't try to get it, and it's not there if he does. So he can't get it one way or the other. The only way to get the sword would have been to do what the Liche Lord knew he wouldn't do, and that would have been to avoid the other rooms.

In working through this, I had a hard time thinking about what the Liche Lord would have done if Thurvan had chosen otherwise, because it may well be that the Liche Lord would not have chosen to set this scenario up at all without the knowledge of Thurvan choosing the way he did. It's hard to think about counterfactual possibilities where the thing that would have been different depends on knowledge of the future in the counterfactual world.

What David Lewis' semantics of counterfactual statements says about 'if Thurvan had chosen to go straight to the sword room, the sword would have been there' is unclear to me. Lewis says to go to the nearest possible world where Thurvan goes straight to the sword room, meaning that you should find the world most like the actual world except for that detail and then see what's true. So if we change nothing in the world except that and what changing it will require, what happens? I can think of three kinds of candidate worlds for the closest:

1. My first thought would be to say that if Thurvan had chosen differently, and if you kept as much intact as possible, then the Liche Lord would have predicted differently and as a result put the sword in the chamber to honor his deal. This world holds the Liche Lord's honesty and abilities constant and changes the state of the world for the entire time between the writing of the letter and the present so that the sword has been there all along.

2. Lewis prefers to find a world intrinsically as much like the actual world as possible. That would require keeping the tomb just as things are in the actual world. But then the Liche Lord would have to have told something false to Thurvan. Either he was lying (2a), or his predictive abilities failed in this one case (2b). I think Lewis has to favor 2b, because even 2a has intrinsic changes with the Liche Lord's beliefs and intents, whereas 2b could be just a surprising failure of his abilities, something like the miracle worlds Lewis discusses in his paper on whether free will requires breaking the laws of nature.

3. Lewis wouldn't like this at all, because it requires even more of a change of the intrinsic state of the world so far than 1, but some might argue that if Thurvan had chosen to take the sword and not go to the other rooms, the Liche Lord would not have set up the case this way at all and wouldn't have given a deal that would mean he'd end up losing. I'm bringing this up only to argue against it as a legitimate near possibility. Seeing this as a near possibility of what would happen given Thurvan's choice to go only for the sword assumes something false. It assumes the Liche Lord is predicting what Thurvan would do given that the Liche Lord sets things up a certain way. According to Gnu's setup, the Liche Lord predicts what Thurvan will do, period. He doesn't consider all the possibilities and make things go his way. His ability only tells him what will happen. So this one requires a difference in the intrinsic state of the world and in the abilities of the Liche Lord. 1 has a difference only in the state of the world (and not even as much of a difference), and 2 has a difference in the abilities or intent of the Liche Lord (and not as much of a difference -- either a one-time failure of the same ability rather than a completely different ability or a different motivation rather than a whole change in the nature of his abilities).

So I think 1 and 2 are the real options for which world is closest to the actual one Gnu has constructed. This is a particularly vivid example of the clash between those who agree with Lewis on nearness of worlds based on intrinsic likeness and what I think is the more commonsense view of nearness of worlds based on preserving the abilities of the Liche Lord that relate causally to the future in certain guaranteed ways. Lewis' view is required for those who reduce causality to relations between intrinsic properties of things across time, and my intuitions against his view on this case are therefore intuitions against his reduction of causality to such things. The causal relations between the Liche Lord and the future that he sees are an important part of the structure of the world, and a world seems to me to be much further from the actual one (of the case) if the Liche Lord has to have different abilities or a failure of his abilities in order to keep the world intrinsically as close as possible. Simply changing some more intrinsic facts seems to me to be less of a change.
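
For reference, the Lewis-style truth condition being applied throughout this post can be stated compactly (a simplified sketch; Lewis's full account also covers the vacuous case where there are no antecedent-worlds):

\[
w \Vdash A \,\Box\!\!\rightarrow\, C \quad\text{iff}\quad \exists w'\,\big[\, w' \Vdash A \wedge C \;\text{ and }\; \forall w''\,\big(w'' \Vdash A \wedge \neg C \;\Rightarrow\; w' <_w w''\big)\,\big]
\]

That is, the counterfactual is true at w just in case some world where both antecedent and consequent hold is closer to w than every world where the antecedent holds and the consequent fails. Options 1, 2a and 2b above are competing verdicts about which antecedent-worlds the similarity ordering <_w ranks closest.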

Posted by Tony Marmo at 15:17 BST
Updated: Monday, 9 August 2004 07:59 BST
Sunday, 25 July 2004

Topic: GENERAL LOGIC

Seminar on Plurality


By Mark Steen
Source: Orange Philosophy, July 24, 2004


Tom McKay approved of my idea of posting his announcement about his seminar on plural quantification, along with related topics (such as non-distributive predication). Tom has a new book on this subject which you can check out by clicking on the departmental webpage on the links list, then clicking on faculty, then McKay [sorry, for some reason my linking feature isn't working now].

I think some local-ish non-Syracusan (e.g., Cornell, Rochester) folk might be interested in attending. Here's the announcement [note that Tom will not have computer access until the end of the month and so you should wait a bit to email him or post questions here for him until August]:


Seminar, Fall 2004, on "Plurality"
(McKay)

There are lots of topics, and I want students' own interests to determine some of what we do.

My fundamental project (in a book I have just finished) has been to explore the issue of expanding first-order predicate logic to allow non-distributive predication. A predicate F is distributive iff whenever some things are F, each of them is F. Consider:

(1) They are students. They are young.
(2) They are classmates. They are surrounding the building.
The predications in (1) are distributive, but the predications in (2) are non-distributive. Non-distributive plural predication is irreducibly plural. In ordinary first-order logic, only distributive predicates are employed.
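
The distributivity condition can be written out in plural-logic notation (a sketch using 'xx' as a plural variable and '≺' for 'is one of', following common practice rather than McKay's own symbols):

\[
\mathrm{Dist}(F) \;\iff\; \forall xx\,\big(F(xx) \rightarrow \forall y\,(y \prec xx \rightarrow F(y))\big)
\]

'They are students' in (1) satisfies the right-hand side, since each of them is a student; 'They are surrounding the building' in (2) does not, since no one of them surrounds the building, which is why such predications resist paraphrase in ordinary first-order terms.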

The incorporation of irreducibly plural predicates is related to a wide range of issues in metaphysics, philosophy of language, foundations of mathematics, logic, and natural language semantics. Some of the issues that we might consider:

What is the nature of plurality? How should we think of the relations among sets, mereological sums, pluralities and individuals? What (if anything) are these different ontological kinds, and how are they related? Can one thing be many? (Is one deck identical to the 52 cards? Or is this not an identity relation?)

Singular and plural predication; singular and plural quantification; singular and plural reference. How do those fit together? When we consider the full range of determiners in English and try to incorporate quantifiers to represent that, there are many interesting semantic issues to resolve.

How does the semantics of plurality relate to the semantics of mass terms?

In the foundations of mathematics, how far can plurals take us without set theory? What is the relationship of second-order logic to plurals and to the foundations of mathematics?

What is the nature of ontological commitment? What does semantics commit the semanticist to? What does it say speakers are committed to? (For example, if I say that the analysis of adverbs requires an event semantics, does that mean that an ordinary user of adverbs is committed to the existence of events? This kind of issue becomes interesting when we look at the semantics of plurals.)

Can we talk about everything without paradox? Are plurals a special resource to enable us to do so?

A large number of issues about the relationship of semantics and pragmatics come together when we consider definite descriptions. Usually discussions focus on singular definite descriptions, but we can see what difference (if any) it makes when we insist that the account be general enough for plural and mass definite descriptions. This then also relates to the consideration of pronominal cross-reference and demonstrative reference.

Some have argued that an event semantics is important for getting plurals right. It will be interesting to look at event semantics and how that relates to plurals.

I will meet with each enrolled student early on in the semester to identify some areas of interest and get started on developing the student's presentation and paper on a topic of the student's choice.

If people are interested in looking into this before the semester begins, my book is available on the department's website: http://philosophy.syr.edu/
(Click on my name in the list of faculty.) Also, Oystein Linnebo has posted a draft of his forthcoming Stanford Encyclopedia article, and it is a good introduction: http://folk.uio.no/oysteinl/. Scroll down to "Plural Quantification."

We will not presume any greater familiarity with logic than you would acquire by being alive and awake through most of PHI 651.

Please get in touch with me if you have questions.

tjmckay@syr.edu

Posted by Tony Marmo at 23:11 BST
Updated: Monday, 9 August 2004 08:02 BST
