LINGUISTIX&LOGIK, Tony Marmo's blog
Thursday, 2 June 2005

Topic: Interconnections

Many-valued logic vs. many-valued semantics

By Jaroslav Peregrin

Hence the task of the logician, viewed from this perspective, is the delimitation of the range of acceptable truth-valuations of the sentences of the given language – taking note of all the "lawful" features of the separation of true sentences from false ones. Let us call this the separation problem.
Consider the language of classical propositional calculus (and consequently the part of natural language which it purports to regiment). Here the ensuing "laws of truth" are quite transparent:
(i) ¬A is true iff A is not true
(ii) A∧B is true iff A is true and B is true
(iii) A∨B is true iff A is true or B is true
(iv) A→B is true iff A is not true or B is true

Every truth-valuation which fulfills these constraints is acceptable and every acceptable truth-valuation does fulfill them. But the situation is, as is well known, not so simple once we abandon the calm waters of the classical propositional calculus.
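As a quick illustration, the four laws can be checked mechanically. The sketch below (plain Python; the tuple encoding of formulas is mine, not Peregrin's) evaluates formulas under a valuation of the atoms, each clause mirroring one of the laws (i)-(iv), and confirms that clause (iv) holds by construction:

```python
from itertools import product

def holds(f, v):
    """Evaluate a formula, given as a nested tuple, under a
    truth-valuation v (a dict from atoms to booleans).
    Each clause mirrors one of the laws (i)-(iv) above."""
    op = f[0]
    if op == "atom":
        return v[f[1]]
    if op == "not":                      # (i)
        return not holds(f[1], v)
    if op == "and":                      # (ii)
        return holds(f[1], v) and holds(f[2], v)
    if op == "or":                       # (iii)
        return holds(f[1], v) or holds(f[2], v)
    if op == "imp":                      # (iv)
        return (not holds(f[1], v)) or holds(f[2], v)
    raise ValueError("unknown connective: %r" % op)

# Every assignment to the atoms extends to exactly one acceptable
# valuation of the whole language; e.g. A->B fails only when A is
# true and B is false:
for a, b in product([True, False], repeat=2):
    v = {"A": a, "B": b}
    assert holds(("imp", ("atom", "A"), ("atom", "B")), v) == ((not a) or b)
```

This is what makes the separation problem transparent in the classical case: the acceptable valuations are exactly the ones a function like this generates.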

Posted by Tony Marmo at 00:01 BST
Wednesday, 1 June 2005

Topic: Cognition & Epistemology

Knowledge and Explanation

By Carrie Jenkins

In this paper I attempt a project of this kind. I propose a necessary and sufficient condition for A knows that p which is, although recognizably similar to the traditional sets of conditions, arguably immune to the kind of counterexample which tends to deter philosophers from thinking that any illuminating conditions can be found. I present this condition, however, not as an analysis of knowledge, but rather as a way of getting a handle on the concept and furthering the effort to understand what its role in our lives might be. Taken in this spirit, the current proposal is not at odds with the principles that motivate Craig's view.
In denying my proposal the status of a reductive analysis, I am mindful of the fact that it will tell us little more than that knowledge is 'non-accidental true belief'. What it offers is a (hopefully fruitful) way of spelling out what is meant by 'non-accidental' in this context. In what follows, I shall write 'KAp' for 'A knows that p' and 'BAp' for 'A believes that p'. I shall propose that KAp just in case BAp and it can be said (under specific circumstances, to be described shortly) that A believes p because p is true. But this is not a causal account of knowledge. The 'because' signals not causation, but explanation.

To appear in the Canadian Journal of Philosophy
Source: Online Papers in Philosophy

Posted by Tony Marmo at 17:41 BST
Updated: Wednesday, 1 June 2005 17:53 BST
Tuesday, 24 May 2005


The Proper Treatment of Coreference Relations

By Louis-H. Desouvrey

A novel approach to coreference relations is proposed. It is shown that there are no coreference principles per se in the grammar. Rather, three independently needed constraints account for this phenomenon: the Obligatory Contour Principle (OCP), the Avoid Ambiguity Constraint (AAC), and the Freedom Constraint. The OCP and the AAC deal with features lexical elements are specified for. Referring elements are indeed distinguished from each other by a relational feature, which represents what the element stands for in the real world. Given nonlinear phonological representations whereby each feature belongs to its own plane, R-features spread from a referential expression to an adjacent referring element, either a pronoun or an anaphor. The ban on line crossing in the representation, which constrains the spreading, accounts for the adjacency between anaphors and their antecedents. The complementarity in the distribution of anaphors and pronouns follows from the feature specification of these elements as well as the interaction of the OCP and the Ambiguity Constraint.

Keywords: coreference, constraints, domains, pronouns, anaphors, syntactic features.
Source: Semantics Archive

The three principles mentioned above are:
(1) Freedom Constraint (FC)
Referring elements must be free in their minimal domain.

(2) Obligatory Contour Principle (OCP)
Two elements bearing identical R-features are banned in the same syntactic domain.

(3) Avoid Ambiguity Constraint (AAC)
Morphological ambiguity must be avoided whenever possible.
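To see how a constraint like the OCP in (2) might be checked, here is a toy sketch (the triple encoding of referring elements is my own illustration, not Desouvrey's formalism):

```python
def ocp_violations(elements):
    """Toy check of the OCP as stated in (2): two elements bearing
    identical R-features inside the same syntactic domain clash.
    elements: list of (name, R-feature, domain) triples."""
    seen = {}
    clashes = []
    for name, r_feature, domain in elements:
        key = (r_feature, domain)
        if key in seen:
            clashes.append((seen[key], name))
        else:
            seen[key] = name
    return clashes

# 'John saw him' with intended coreference: both elements carry R1 in
# the same clause, so the OCP blocks the reading; putting the pronoun
# in a different domain removes the clash.
local = [("John", "R1", "IP"), ("him", "R1", "IP")]
nonlocal_ = [("John", "R1", "IP1"), ("him", "R1", "IP2")]
```

The point of the sketch is only that coreference effects fall out of a general feature-clash check, with no coreference principle mentioned anywhere in the code.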

Posted by Tony Marmo at 00:01 BST
Updated: Tuesday, 24 May 2005 14:37 BST
Sunday, 15 May 2005


Tense and Choice Functions

By Mean-Young Song

In this paper, I argue against the two major approaches to the semantics of tense: the quantificational approach and the referential approach. The difference between them lies in the fact that the former is in favor of the indefinite view of tense, whereas the latter favors the definite view. Tenses are characterized by variability, in the sense that they can be interpreted as indefinite in some contexts and definite in others. The difficulty the two approaches face is that neither takes appropriate account of this variability. To provide a proper semantic treatment of tense, I propose a treatment which incorporates choice functions. A choice function, which applies to a non-empty set of intervals, picks the most salient time the speaker might have in mind at the utterance time. This might assist in capturing the variability of tense.

Key words: tense, definite tense, indefinite tense, the quantificational approach, the referential approach, choice functions, presuppositions, temporal predicates

Source: Semantics Archive
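A minimal sketch of the idea, assuming a contextual salience ranking (both the function and the ranking below are my own stand-ins, not Song's formal definitions):

```python
def choice_tense(intervals, salience):
    """Toy choice function for tense: applied to a non-empty set of
    candidate past intervals, it returns the interval the speaker
    most saliently has in mind at the utterance time.
    `salience` stands in for what context supplies."""
    if not intervals:
        raise ValueError("a choice function needs a non-empty set")
    return max(intervals, key=salience)

# 'I turned off the stove': context ranks one recent interval highest,
# yielding a definite reading; with a flatter ranking the same tense
# would come out indefinite.
candidates = ["last week", "before leaving home", "this morning"]
rank = {"last week": 1, "before leaving home": 3, "this morning": 2}
picked = choice_tense(candidates, rank.get)   # "before leaving home"
```

The variability of tense then lives in the salience ranking, not in an ambiguity of the tense morpheme itself.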

Posted by Tony Marmo at 08:10 BST
Saturday, 14 May 2005

Topic: Interconnections

Counterfactuals and historical possibility

by Tomasz Placek and Thomas Muller

We show that truth conditions for counterfactuals need not always be given in terms of a vague notion of similarity. To this end, we single out the important class of historical counterfactuals and give formally rigorous truth conditions for these counterfactuals, employing a partial ordering relation called "comparative closeness" that is defined in the framework of branching space-times. Among other applications, we provide a detailed analysis of counterfactuals uttered in the context of lost bets. In an appendix we compare our theory with the branching space-times based reading of counterfactuals recently proposed by Belnap [1992].

Keywords: branching space-times, historical counterfactuals, comparative closeness

Source: PhilSci Archive, Online Papers in Philosophy

Posted by Tony Marmo at 00:01 BST
Updated: Sunday, 15 May 2005 08:14 BST
Sunday, 8 May 2005

Another interesting introduction to Paraconsistent Logic:


By Anthony Hunter
In practical reasoning, it is common to have “too much” information about some situation. In other words, it is common for there to be classically inconsistent information in a practical reasoning database [Besnard et al., 1995]. The diversity of logics proposed for aspects of practical reasoning indicates the complexity of this form of reasoning. However, central to practical reasoning seems to be the need to reason with inconsistent information without the logic being trivialized [Gabbay and Hunter, 1991; Finkelstein et al., 1994]. This is the need to derive reasonable inferences without deriving the trivial inferences that follow the ex falso quodlibet proof rule that holds in classical logic.
[Ex falso quodlibet]
α, ¬α ⊢ β

So, for example, from a database {α, ¬α, α→β, δ}, reasonable inferences might include α, ¬α, α→β and δ by reflexivity, β by modus ponens, α∧β by ∧-introduction, ¬α→¬β, and so on. In contrast, trivial inferences might include γ, γ∧¬δ, etc., by ex falso quodlibet.
Solutions to the problem of inconsistent data include database revision and paraconsistent logics. The first approach effectively removes data from the database to produce a new consistent database. In contrast, the second approach leaves the database inconsistent, but prevents the logic from deriving trivial inferences. Unfortunately, the first approach means we may lose useful information: we may be forced to make a premature selection of our new database, or we may not even be able to make a selection. We consider here the advantages and disadvantages of the paraconsistent approach.
The primary objective of this chapter is to present a range of paraconsistent logics that give sensible inferences from inconsistent information.
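One simple way to see how trivialization can be blocked is to restrict inference to consistent subsets of the database, in the spirit of Rescher-Manor consequence (a simplification of mine, not one of the chapter's own systems):

```python
from itertools import combinations

def consistent(facts):
    """A set of literals is consistent if it never contains
    both p and 'not p'."""
    return not any(("not " + p) in facts for p in facts)

def close(formulas):
    """Close the non-conditional literals under modus ponens,
    using the conditionals ('p->q') in the set."""
    facts = {f for f in formulas if "->" not in f}
    changed = True
    while changed:
        changed = False
        for f in formulas:
            if "->" in f:
                ante, cons = f.split("->")
                if ante in facts and cons not in facts:
                    facts.add(cons)
                    changed = True
    return facts

def derivable(goal, db):
    """Toy paraconsistent inference: a goal counts as a reasonable
    inference if SOME consistent subset of the database derives it.
    Ex falso is blocked: no consistent subset yields an arbitrary
    formula."""
    for r in range(len(db), 0, -1):
        for sub in combinations(db, r):
            facts = close(set(sub))
            if consistent(facts) and goal in facts:
                return True
    return False

# The database {a, not a, a->b, d} from the example above:
db = ["a", "not a", "a->b", "d"]
# b is a reasonable inference ({a, a->b} is consistent and yields it);
# an unrelated goal 'c' is not derivable, so the logic is not trivial.
```

This is only one of several paraconsistent strategies; the chapter surveys a range of logics with different trade-offs.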

Posted by Tony Marmo at 09:36 BST
Friday, 29 April 2005


Foundations of Attentional Semantics

By Giorgio Marchetti

Words are tools that pilot attention. As such, they can be analyzed in terms of the attentional changes they convey. In this article, the process by which words produce attentional changes in the subject hearing or reading them is examined. A very important step in this process is represented by the subject’s conscious experience of the meaning of words. The conscious experience of the meaning differs from the conscious experience of images and perceptions in that the former lacks the qualitative properties of the latter; moreover, while meanings refer to a whole class of objects or events, images and perceptions do not. The peculiar quality and the context- and object-independent character of this form of consciousness is determined by the kind of elements that constitute meanings: attentional operations. As shown by the psychological literature on attention, and by personal experience, attention can be variously piloted to perform several kinds of operations: it can be oriented, focused at variable levels of size and intensity, sustained for variable amounts of time, each single attentional operation can be variously combined with other attentional operations, forming an orderly, albeit complex, sequence of attentional operations, etc. Each meaning is composed of a specific sequence of attentional operations. This sequence represents the skeleton that supports and allows the conversion or actualization of the meaning into any of its sensible, perceptible instances. It is precisely the presence of the attentional operations that induces in the subjects the conscious experience of the meaning.
The subject learns the meanings of words by focusing its attention on the attentional operations that constitute them. The capacity, here labelled as a meta-attentional one, to isolate the attentional operation constituting the meaning does not entail a secondary process occurring at a different level from, but simultaneous with, the primary process to be analyzed. On the contrary, the analyzing process occurs at the same level - the conscious one - as the analyzed process, but a moment later: the subject becomes conscious of the attentional operations constituting the meaning simply by performing them.
Once the process that makes it possible for words to produce attentional changes is explained, and the kind of components constituting meanings is identified, the foundations of Attentional Semantics are laid: the road to analyzing the meaning of words in terms of attentional operations is open.

Keywords: attention, conscious experience, meaning, words, language, attentional operations, semantics
Source: Semantics Archive

Posted by Tony Marmo at 16:19 BST
Updated: Friday, 29 April 2005 16:20 BST
Tuesday, 26 April 2005


The Meta-Fibring Environment: Preservation of meta-properties by fibring

By Marcelo E. Coniglio

In this paper the categories Mcon and Seq of multiple-conclusion consequence relations and sequent calculi, respectively, are introduced. The main feature of these categories is the preservation, by morphisms, of meta-properties of the consequence relations. This feature is obtained by changing the usual concept of morphism between logics (that is, a signature morphism preserving deductions) to a stronger one (a signature morphism preserving meta-implications between deductions). This allows us to obtain better results by fibring objects in Mcon and in Seq than using the usual notion of morphism between deduction systems:
In fact, meta-fibring (that is, fibring in the proposed categories) avoids the phenomenon of fibring that we call the anti-collapsing problem, as opposed to the well-known collapsing problem of fibring. Additionally, a general semantics for objects in Seq (and, in particular, for objects in Mcon) is proposed, obtaining a category of logic systems called Log. A general theorem of preservation of completeness by fibring in Log is also obtained.
Source: CLE

Posted by Tony Marmo at 21:42 BST
Updated: Tuesday, 26 April 2005 21:44 BST
Thursday, 21 April 2005

Topic: Syn-Sem Interface


In the following I shall show that there is not one single notion of phase in Chomsky (2001) and that each of the different formulations has diverse consequences.

The set of analyses and conclusions delineated in my paper (forthcoming) and Chomsky's (1999) proposals converge on the notion that the phases of a derivation are propositional. Indeed, as Chomsky himself acknowledges, verbal phrases have full argument structure and CPs have force indicators. However, the same notion of phase in Chomsky (2001) is derived through four other different paths.

The first path to derive the notion of phase, which Chomsky has chosen, departs from the inclusiveness condition. He maintains his 1995 idea that the input of a derivation is an array of items taken from the lexicon (LA), but divides it into sub-arrays. Each phase is determined by a sub-array LA{i} of LA, placed in the active memory. When the computation exhausts LA{i}, forming the syntactic object K, L returns to LA, either extending K to K' or forming an independent structure M to be assimilated later to K or to some extension of K.

These premises seem to pervade Chomsky's thoughts, but in discussing what categories demarcate the phases, i.e., how to label them, he presents a second view, which furthers the first. So, he argues that a sub-array LA{i} must be easily identifiable and so contain exactly one lexical item that will label the resulting phase. If one accepts that substantive categories are selected by functional categories, namely V by a light verb v and T by C, then one gets the following thesis:
MT(1) Phases are CP and vP, and a sub-array contains exactly one C or v.

The third alternative to circumscribe phases consists of finding and summoning facts that maintain and correlate PF and LF integrity more generally, which Chomsky claims to be independent support:
MT(2) CP and vP are reconstruction sites, and have a degree of phonetic independence.

The fourth way consists of adopting the idea that the main functional categories, which may have an EPP feature and so function as targets for movement, are phases, but they are divided according to their strength:
MT(3) CP and vP are strong phases, all the others are weak.
According to the thesis embraced, the spell-out operation will apply simply at a phase or at a strong phase level. The options might all seem very stipulative, put this way, but they all play a crucial role in Chomsky's consistent argumentation in favour of the idea that only the phonological component proceeds in parallel, and not the syntactic and semantic ones. This is expressed in the minimalist thesis below:
MT(4) a. There is no overt-covert distinction with two independent cycles; rather, a single narrow-syntactic cycle;
b. The phonological cycle is not a third independent cycle, but proceeds essentially in parallel.

We have already explained the distinction between delete and erase in Section 1, which underlies the reasoning above: by making the deleted features disappear, convergence at LF is allowed. In order to reduce computational burden and allow the phonological component to disregard earlier stages of the derivation, another stipulation, the phase impenetrability condition, is added:
(PIC) Only H and its edge are accessible to operations outside a strong phase HP.

The edge is either specs or elements adjoined to HP. Under PIC, operations apply to the accessible elements of HP within the smallest strong phase ZP containing HP and not beyond, and the phonological component spells out elements that move no further. But this means that spell-out interprets H and its edge as parts of ZP in a structure like:
(1) [ZP Z ... [HP α [H YP]]]

Which leads to Ev1, already mentioned.
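The accessibility pattern that PIC imposes on a structure like (1) can be pictured with a toy sketch (the tuple encoding of a phase is mine, purely illustrative):

```python
def accessible_from_outside(phase):
    """Toy rendering of the PIC: a strong phase HP is modelled as a
    triple (edge, head, complement). Operations outside HP may target
    the head H and its edge (specifiers and adjuncts), while the
    complement YP has already been spelled out and is frozen."""
    edge, head, complement = phase
    return list(edge) + [head]

# HP with a specifier and an adjunct in its edge:
hp = (["Spec", "Adjunct"], "H", "YP")
# A probe in ZP can reach 'Spec', 'Adjunct' and 'H', but never 'YP'.
```

Nothing in the sketch depends on what H is, which is precisely the point of treating phasehood derivationally rather than categorially, as argued below.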
Let us summarise the dilemmas resulting from these views:
D1 a. A phase is the product of derivational procedures that divide lexical arrays, obeying the inclusiveness condition.
b. A phase is a special functional selector category.

D2 a. The motivation to postulate the existence of phases is semantic: they are characterised as propositional and as reconstruction sites.
b. The motivation to postulate the existence of phases is phonological: they are characterised by the cyclic application of spell-out.

D3 a. Functional categories are divided into phases and non-phases.
b. All functional categories are phases. Phases are divided into weak and strong types.

Neither alternative in the third dilemma is actually good, for phase should be maintained as a derivational notion and not treated as a categorial one, for which reason (D1b) must also be discarded. Perhaps (D2) can never be solved, but it permits us to sustain the view that the notion of phase rests upon the application of an operation.
Indeed, the elimination of PIC has already been argued for in the literature. For instance, Stjepanović and Takahashi (2003) claim that the effects of PIC can be derived from independently necessary computational mechanisms, such as multiple spell-out and pied-piping. But any arguments in this direction also provide the means to derive the notion of phase from the application of spell-out. If, on the other hand, spell-out is not the only correspondence operation that maps the Syntactic Structure onto a representational level, i.e., if spell-out has a sister reading operation that interacts with the semantic level, then the two operations may be ordered, the application of the second operation culminating in the formation of a phase. The only argument for assuming that the second correspondence operation is reading is the empirical observation that the Phonetic Structure seems always to be delayed in relation to the Conceptual Structure.
So, we have the point that PIC is not necessary, for it can be a mere effect of the cyclic application of different operations, as well as a consequence of a version of Gödel's diagonal lemma. Nevertheless, (D2b) also reveals the weak point of the narrow syntax hypothesis: it does not explain why the phonological cycle should essentially proceed in parallel, as stated in MT(7), but the semantic one should not.
(D1a) is the truly principled alternative among all others.

Posted by Tony Marmo at 00:01 BST
Updated: Friday, 22 April 2005 01:44 BST
Wednesday, 20 April 2005

Topic: Interconnections

The Ken Hale Prize

Throughout his life Ken Hale set an example of what competence means in academic circles: a person of great skills using them to help others construct positive things. He combined ethics and expertise in his daily work.
Before his passing, he had asked that any donations in his memory be made to the Navajo Language Academy, an organisation of Navajo scholars whom he had trained.

I think that a Ken Hale Prize should be created for socially conscious linguists and/or scholars, and for those who have helped to promote and foster Linguistics, stimulating new linguists and the diffusion of the works of linguists.

Posted by Tony Marmo at 15:40 BST

Answer to Noel Hunt

In a reaction to my entry on Linking, Noel Hunt wrote me asking:
I was wondering if you could provide a bibliography for the references you cite in your short note.

Well, to all that may be concerned, the references, as far as I can recall, are:

From the transformational era

Langacker, R. (1966) On the Pronominalisation and the Chain of Command. In Reibel, D.A. & S.A. Schane (eds.), Modern Studies in English. Prentice-Hall, Englewood Cliffs, N.J.
Rosenbaum, Peter (1967) The Grammar of English Predicate Complement Constructions. Cambridge, MA: MIT Press.
Postal, Paul M. (1970) On Co-referential Complement Subject Deletion. In Linguistic Inquiry volume 1, number 4.
Jackendoff, Ray (1972) Semantic Interpretation in Generative Grammar. MIT Press, Cambridge, MA.
Chomsky, Noam (1973) Conditions on Transformations. In Anderson and Kiparsky (eds) 1973.
Lasnik, Howard (1976) Remarks on Co-reference. Linguistic Analysis 2, 1-22.

Post-transformational Generative Grammar (Principles and Parameters)

Reinhart, Tanya (1976) The Syntactic Structure of Anaphora. PhD dissertation, MIT, Cambridge, Massachusetts.
Reinhart, Tanya (1981) Definite NP Anaphora and C-Command Domains. In Linguistic Inquiry volume 12, number 4.
Chomsky (1980) On Binding. Linguistic Inquiry, volume 11: pages 1-46.
Chomsky (1981) Lectures on Government and Binding (The Pisa Lectures). Mouton de Gruyter, Berlin and New York.
Chomsky (1986) Barriers. MIT Press, Cambridge, MA.
Bresnan, Joan. 1982. Control and complementation. In Joan Bresnan (ed.) The mental representation of grammatical relations. Cambridge, MA: MIT Press.
Manzini, Maria Rita (1983). On Control and Control Theory. In Linguistic Inquiry volume 14, number 3.

Linking Theory

Higginbotham, James (1980) Pronouns and Bound Variables. In Linguistic Inquiry volume 11, number 4.
Higginbotham, James (1983) Logical Form, Binding, and Nominals. In Linguistic Inquiry volume 14, number 3.

Later Proposals

Reinhart, Tanya (1999) Binding Theory. In Robert A. Wilson and Frank C. Keil, eds, The MIT Encyclopedia of the Cognitive Sciences, Cambridge Mass: MIT Press.
Heim, Irene (2004) Anaphora and Semantic Interpretation: A Reinterpretation of Reinhart's Approach. Ms.
Reinhart, Tanya (2000) Strategies of Anaphora Resolution. In Hans Bennis, M. Everaert and E. Reuland (eds) Interface Strategies. North Holland Amsterdam.

Posted by Tony Marmo at 06:28 BST
Monday, 18 April 2005

Topic: Cognition & Epistemology

Probability, Modality and Triviality

By Antony Eagle

Many philosophers accept the following three theses:
(1) that probability is a modal concept;
(2) that, if determinism is true, there would still be objective modal facts; and
(3) that if determinism is true, there are no genuine objective probabilities (chances).

I argue that these three claims are inconsistent, and that their widespread acceptance is thus quite troubling. I suggest, as others have, that we should reject the last thesis: objective probability is perfectly compatible with determinism. Nevertheless we must still explain why this thesis seems attractive; I suggest that a subtle equivocation is to blame.

Source: Online Papers in Philosophy

Posted by Tony Marmo at 14:01 BST
Updated: Monday, 18 April 2005 14:03 BST


The Telicity Parameter Revisited

By Hana Filip

The goals of this paper are threefold:
First, to review some recent syntactic accounts of cross-linguistic differences in the expression of telicity in Slavic vs. Germanic languages.
Second, I will argue that the parametric variation in the encoding of telicity cannot be based on a unidirectional specifier head agreement between the verbal functional head linked to the telicity of the VP and the DO-DP in its specifier position, with languages exhibiting two clearly distinct modes of assigning telicity to the functional head. In the simplest terms, in Germanic languages, it is assigned by the DO-DP and in Slavic by the perfective/imperfective aspect of the lexical VP head. Rather, in a given telicity structure in both Slavic and Germanic languages, we actually observe mutual constraints and interactions between the head verb and one of its semantic arguments, namely, the incremental argument.
Third, the variation in the encoding of telicity cannot be limited just to syntactic factors. Instead, it is semantic (and also pragmatic) factors that ultimately motivate

(i) the phenomena that the syntactic parametric approach tries to capture, and also
(ii) telicity phenomena that are a priori precluded by it, left out or unnoticed. In this connection, I will defend the familiar (though often forgotten) insights of Krifka's (1986 and elsewhere) and Dowty's (1991) mereological theory of telicity.

Source: Semantics Archive

Posted by Tony Marmo at 13:30 BST

Topic: Interconnections


What are we supposed to say when the answer to a question is the question itself? This seems to have been the very puzzle of the studies on human behaviour and nature.

Posted by Tony Marmo at 02:49 BST
Updated: Monday, 18 April 2005 13:21 BST
Friday, 15 April 2005

Topic: Cognition & Epistemology

What justifies that?

By Patrick Hawley

I clarify and defuse an argument for skepticism about justification with the aid of some results from recent linguistic theory. These considerations shed light on discussion about the structure of justification.

Take a look

Posted by Tony Marmo at 00:01 BST
Updated: Sunday, 13 March 2005 23:55 GMT
