LINGUISTIX&LOGIK, Tony Marmo's blog
Wednesday, 6 June 2007

Topic: HUMAN SEMANTICS

Context, Content and Relativism


By Michael Glanzberg

Here is a simple and inviting picture: the semantic values of sentences, relative to contexts, are sets of possible worlds. These are the truth conditions of assertions of those sentences in contexts. They are thus the contents of assertions, or the objects of attitudes we might take towards such contents.
There have been many questions raised about the simple picture. I propose to ignore these questions to focus on whether the semantic values of sentences should be sets of something more than possible worlds.
My main concern here shall be with the philosophy of language side of this debate. I shall argue that in fact, thinking about the way language works does not give us any argument for relativism. I shall also suggest, in the end, that the argument which leads to this kind of rampant relativism hinges on a particularly stringent view about the way context fixes contextual parameters. I shall suggest this stringent view is not well-justified, and that language shows us many contextual effects which do not conform to it. This will not constitute a knock-down argument against relativism, but I do hope to show that sober reflection on language offers relativism no support.
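To fix ideas, the two pictures at stake can be glossed schematically (the notation is my own gloss, not Glanzberg's):

Simple picture:     ⟦φ⟧^c = { w ∈ W : φ is true at ⟨c, w⟩ }
Relativist picture: ⟦φ⟧^c = { ⟨w, x⟩ : φ is true at ⟨c, w, x⟩ },  with x some further parameter (a judge, a standard, a time of assessment)

The dispute is over whether anything in natural language forces the extra coordinate x into the contents of assertions.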

Source: Semantics Archive 


Posted by Tony Marmo at 16:18 BST
Updated: Wednesday, 6 June 2007 16:33 BST
Thursday, 23 November 2006

Topic: HUMAN SEMANTICS

Evidentiality, Modality and Probability


By Eric McCready & Norry Ogata

We show in this paper that some expressions indicating source of evidence are part of propositional content and are best analyzed as a special kind of epistemic modal. Our evidence comes from the Japanese evidential system. We consider six evidentials in Japanese, showing that they can be embedded in conditionals and under modals and that their properties with respect to modal subordination are similar to those of ordinary modals. We show that these facts are difficult for existing theories of evidentials, which assign evidentials necessarily widest scope, to explain. We then provide an analysis using a logical system designed to account for evidential reasoning; this logic is the first developed system of probabilistic dynamic predicate logic. This analysis is shown to account for the data we provide that is problematic for other theories.

Keywords: evidentiality, modality, probability, Japanese, dynamic semantics

Source: Semantics Archive

 



Posted by Tony Marmo at 00:01 GMT
Updated: Saturday, 25 November 2006 19:48 GMT
Friday, 29 September 2006

Topic: HUMAN SEMANTICS

Induction and Comparison


By Paul Pietroski

Logical induction may be important for theoretical linguistics, even if children do not induce languages from experience. Either our human capacities for inductive reasoning lie near the heart of our capacity to generate and understand expressions of a human language, or not. If they do, then theoretically minded linguists should try to understand human inductive capacities and the kinds of understanding they make possible, independent of other cognitive capacities. If not, then we should be clear about this, and not pretend otherwise-say, by adopting semantic theories that exploit the full resources of the logic that Frege used to reduce arithmetic to Hume's Principle. But suppose our best theories of language do presuppose that speakers have inductive capacities. Then considerations of theoretical parsimony suggest that we theorists should squeeze as much as we can from our representations of human inductive capacities, before adding controversial assumptions about how speakers understand expressions. This leaves room for hypotheses according to which speakers understand certain sentences in terms of covert quantification over abstracta.
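For reference, Hume's Principle, alluded to above, is the second-order biconditional (a standard formulation, not taken from the paper):

Nx:Fx = Nx:Gx  ↔  the Fs and the Gs are in one-to-one correspondence

Frege's theorem, in its modern reconstruction, derives the axioms of arithmetic from this principle in second-order logic, which gives a sense of how much expressive power a semantic theory exploiting "the full resources" of that logic is claiming for speakers.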

Source: Semantics Archive


 


Posted by Tony Marmo at 14:58 BST

Now Playing: Reposted
Topic: HUMAN SEMANTICS

Quantification and Second-Order Monadicity


By Paul M. Pietroski

Once we grant that grammatical structure can be as complicated as logical structure, and just as distant from audible features of word strings, we can approach the study of human cognition by combining the insights of modern logic (and not just its first-order fragment) and linguistics. Those deciding where to invest might want to compare this project, in terms of the results it has delivered and its potential for delivering more in the foreseeable future, with alternative projects that philosophers of language and mind have been pursuing. My bias in this regard will be evident. Though a more dispassionate assessment might lead to much the same conclusion: for now, our best hope of learning something important about the structure of thought, and of giving substance to the ancient idea of language as a mirror of the mind, lies with figuring out how Frege, Chomsky, Montague, Davidson, and many others could each be importantly right, and no doubt wrong, about the same thing: namely, the shared syntactic/semantic structure of our sentences/thoughts. My suggestion is that this structure is more conjunctive, monadic, and second-order than one might think.
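A textbook illustration of the conjunctive, monadic format (the example is the standard neo-Davidsonian one, not drawn from this paper): 'Brutus stabbed Caesar violently' is analysed as quantification over an event whose description is a conjunction of simple predicates:

∃e [ Stabbing(e) ∧ Agent(e, Brutus) ∧ Patient(e, Caesar) ∧ Violent(e) ]

Pietroski's proposal is, roughly, that much more of natural-language semantics than usually supposed has this conjunctive shape, with second-order (plural/monadic) resources doing the remaining work.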

Source: Semantics Archive


 


Posted by Tony Marmo at 00:01 BST
Updated: Friday, 29 September 2006 15:00 BST
Saturday, 8 July 2006

Topic: HUMAN SEMANTICS

Scopal Independence: A Note on Branching and Wide Scope Readings of Indefinites and Disjunctions




By Philippe Schlenker

Hintikka claimed in the 1970s that indefinites and disjunctions give rise to 'branching readings' that can only be handled by a 'game-theoretic' semantics as expressive as a logic with (a limited form of) quantification over Skolem functions. Due to empirical and methodological difficulties, the issue was left unresolved in the linguistic literature. Independently, however, it was discovered in the 1980s that, contrary to other quantifiers, indefinites may scope out of syntactic islands. We claim [here] that branching readings and the island-escaping behavior of indefinites are two sides of the same coin: when the latter problem is considered in full generality, a mechanism of 'functional quantification' (Winter 2004) must be postulated which is strictly more expressive than Hintikka's, and which predicts that his branching readings are indeed real, although his own solution was insufficiently general. Furthermore, we suggest that, as Hintikka had seen, disjunctions share the behavior of indefinites, both with respect to island-escaping behavior and (probably) branching readings. The functional analysis can thus naturally be extended to them.
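The kind of reading at issue can be illustrated with quantification over Skolem functions (a generic illustration, not Schlenker's own example): on its functional reading, 'Every student read some book' can be rendered as

∃f ∀x [ Student(x) → Read(x, f(x)) ]   (f a function from students to books)

so that the indefinite appears to escape the scope of the universal (or of an island) because its witness is supplied by a function of the universally bound variable.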

Source: Institut Jean-Nicod.
To appear in Journal of Semantics.


Posted by Tony Marmo at 03:43 BST
Updated: Saturday, 8 July 2006 03:56 BST
Thursday, 15 June 2006

Topic: HUMAN SEMANTICS

Meaning and Dialogue Coherence: A Proof-theoretic Investigation


By Paul Piwek

This paper presents a novel proof-theoretic account of dialogue coherence. It focuses on cooperative information-oriented dialogues and describes how the structure of such dialogues can be accounted for in terms of a multi-agent hybrid inference system that mixes natural deduction with information transfer and observation. We show how the structure of dialogue arises out of the interplay between the inferential roles of logical connectives (i.e., sentence semantics), a rule for transferring information between agents, and rules for information flow between agents and their environment. Our order of explanation is opposite in direction to that adopted in the game-theoretic semantics tradition, where sentence semantics (or a notion of valid inferences) is derived from (winning) dialogue strategies. The approaches may, however, be reconcilable, since we focus on cooperative dialogues, whereas the latter concentrates on adversarial dialogue.
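The flavour of the rules can be conveyed schematically (my own rendering of the abstract's description, not Piwek's formulation): alongside the usual introduction and elimination rules for the connectives, the hybrid system has moves such as

(Transfer)      if agent A has derived φ and communicates it, agent B may add φ to B's information state
(Observation)   if φ holds in the shared environment and agent B observes it, B may add φ to B's information state

and dialogue coherence is then a matter of which such moves are licensed at each stage of the joint derivation.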

Keywords: natural deduction, dialogue, coherence, hybrid inference


In: Proceedings of ESSLLI Workshop on Coherence in Generation and Dialogue, Málaga, Spain, 2006, pp. 57-64.
Source: Semantics Archive

Posted by Tony Marmo at 16:31 BST
Monday, 5 June 2006

Topic: HUMAN SEMANTICS

On Aristotle and Baldness: Topic, Reference, Presupposition, and Negation


By Johan Brandtler


This paper is a contribution to the never settled debate on reference, negation and presupposition of existence in the linguistic/philosophical literature. Based on Swedish and English data, the discussion is an attempt to present a unified account of the opposing views put forward in the works of Aristotle, Frege (1892), Russell (1905) and Strawson (1950). The starting point is the observed asymmetry in Swedish (and English) that negation may precede a quantified subject NP in the first position, but not a definite subject NP or a proper name. This asymmetry is argued to be due to semantic, rather than syntactic, restrictions. In the model proposed here, negating a topic NP affects the “topic selection”. This is allowed with quantified NPs, since negating a quantifier leads only to a modification of the topic selection. For definite/generic subject NPs this cannot be allowed, since negating a definite NP equals cancelling the topic selection. This leads to a ‘crash’ at the semantic level.
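The asymmetry in question can be shown with the English half of the data (illustrative examples of my own, parallel to the paper's Swedish pattern):

(1) a.  Not every student passed the exam.
    b. *Not John passed the exam.

On the proposed model, (1a) is acceptable because negating the quantifier only modifies the topic selection, while (1b) is out because negating the proper name would cancel the topic selection, producing the semantic 'crash'.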

keywords: negation, presupposition, reference, topic, aristotle, frege, russell, strawson, quantifiers, semantics

Published in: Working Papers in Scandinavian Syntax, volume 77 (2006), 177-204. Lund University, Sweden.

Source: LingBuzz/000281


Posted by Tony Marmo at 18:32 BST
Friday, 5 May 2006

Topic: HUMAN SEMANTICS

A game-theoretic account of implicature


By Prashant Parikh

I use game theory, decision theory, and situation theory to model a class of implicatures. Two types of relevance are distinguished and used to construct a model of Gricean communication between speaker and addressee.

Source: Proceedings of the 4th Conference on Theoretical Aspects of Reasoning about Knowledge, Monterey, California, pp. 85-94, 1992.

Posted by Tony Marmo at 16:46 BST
Updated: Friday, 5 May 2006 16:50 BST
Friday, 10 March 2006

Topic: HUMAN SEMANTICS

Davidson's Criticism of the Proximal Theory of Meaning


By Dirk Greimann

According to the proximal theory of meaning, which is to be found in Quine’s early writings, meaning is determined completely by the correlation of sentences with sensory stimulations. Davidson tried to show that this theory is untenable because it leads to a radical form of skepticism. The present paper aims to show, first, that Davidson’s criticism is not sound, and, second, that nonetheless the proximal theory is untenable because it has a very similar and equally unacceptable consequence: it implies that the truth-value of ordinary sentences like ‘Snow is white’ is completely determined by the properties of the speaker, not by the properties of the objects to which these sentences refer.

Appeared in Principia - An International Journal Of Epistemology, volume 9, n. 1-2, p. 73-86, 2005.


Posted by Tony Marmo at 16:25 GMT
Monday, 6 March 2006

Topic: HUMAN SEMANTICS

Universal versus Existential Quantifiers


The Russian vsjakij


By Georgy Bronnikov

The quantifier vsjakij has drawn considerable attention from semanticists in the Russian tradition. This article proposes an analysis based on the morphological structure of the word, using Carlson’s (1977) theory of kind reference. The result is an account that allows us to give a unified treatment to generic [sic] and existential uses of vsjakij, which, to my knowledge, has never been done before. There remain a number of problematic cases; those are noted and, where possible, analyzed as well. If the proposed account is correct, vsjakij turns out to be a near-exception to a well-known universal stating that no language has determiners specialized for kind reference (see, for example, Gerstner-Link and Krifka 1995, p. 967, Dayal 2004, p. 394).

Source: Semantics Archive


Posted by Tony Marmo at 14:24 GMT
Updated: Monday, 6 March 2006 14:28 GMT
Thursday, 16 February 2006

Topic: HUMAN SEMANTICS

A PUZZLE ABOUT MODALITY


One of the curious properties of the Modal Semantics of Human Languages is that it does have a T principle and a Necessitation rule, and even Aristotle’s law, but lacks banalisation or collapse.
The T principle may be stated as follows:
(T)  □A ⊃ A

In Logic the Necessitation Rule requires that if A is a thesis of a certain modal system S, then □A is also a thesis of S. In the study of Natural Languages one may, for instance, think of a variation of Necessitation by simply saying that if a string or sentence that expresses a proposition is a sentence of a language, then the sentence that expresses that the same proposition is necessary must also be a sentence of the same language. But that would be a very basic and elementary principle of natural language.
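For comparison, the Necessitation rule and the collapse (banalisation) principle may be stated in the same notation (the formulations are standard; their application to natural language is the post's point):

(Nec)        if ⊢ A, then ⊢ □A
(Collapse)   A ⊃ □A   (together with T, this would make □A and A equivalent)

The puzzle is that the modal vocabulary of natural languages appears to validate analogues of T and Necessitation without validating anything like Collapse.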
[READ MORE]

Posted by Tony Marmo at 16:45 GMT
Thursday, 1 December 2005

Topic: HUMAN SEMANTICS

Dynamic Situations: Accounting for Dowty’s Inertia Notion Using Dynamic Semantics


By Ido Ben-Zvi

The theory I advocate is threefold. First, while trying to follow closely in the footsteps of Dowty’s intuitively appealing concept of inertia (the idea of ‘things going on in a normal fashion’), I hold that the modal basis for this concept is epistemic and not ontological. This may seem to be in line with Dowty’s own theory, at least with that fuzzy part about things going on normally. But I will show that Dowty’s modality is either completely ontological, in which case it does not provide the required results, or else is an inconsistent mixture of an ontological and an epistemic base.
Second, I hold that the notion of partiality plays a critical role in the semantics of the progressive. I think that at the intuitive level this too is an enticing conviction. The progressive appears to be a kind of commonsensical projection of what we know on to the parts of reality of which we do not know. Thus the zebra may truly be said to be finishing off the greenery if its (or our) partial knowledge does not include data about the approaching feline death. In trying to analytically bite off a chunk from the vague notion of normality I will take partiality a step further and use it to formally explain what it means for nothing unexpected or out of the ordinary to happen. This is a particularly difficult notion to capture formally because of the double use of negation: not only are we after those ‘things’ which are unexpected, but we are also interested in those cases where they don’t happen.
This leads us to the third pillar on which this thesis rests. Partiality will give us an explanation of what the unexpected happenings are, and my third point is that built into the progressive operator is a kind of minimality constraint. Being interested only in those cases where nothing unexpected happens means throwing away all those cases where something superfluous does happen if we can also imagine a similar case where it does not. Once again, my aim is to crystallize this intuition in a formal way.
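For orientation, Dowty's original (1979) truth conditions for the progressive, from which the inertia debate starts, run roughly as follows (my paraphrase of the standard formulation, not Ben-Zvi's own statement):

PROG(φ) is true at ⟨I, w⟩ iff for every inertia world w′ ∈ Inr(⟨I, w⟩) there is an interval I′, of which I is a non-final subinterval, such that φ is true at ⟨I′, w′⟩.

The question the thesis presses is how the inertia worlds in Inr are to be understood: ontologically, as worlds where things go on 'normally', or epistemically, relative to what is known.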


Keywords: progressive, imperfective, dynamic semantics, situations


Source: Semantics Archive



Posted by Tony Marmo at 16:42 GMT
Updated: Thursday, 1 December 2005 16:45 GMT
Tuesday, 22 November 2005

Topic: HUMAN SEMANTICS

EVENT POSITIONS: Suppression and emergence


By James Higginbotham


Donald Davidson proposed in 1967, and elaborated in subsequent work, the thesis that action predicates in natural language contain an argument position ranging over events, a position that in simple sentences was cashed out through existential quantification. As Claudia Maienborn remarks, Davidson's proposal is naturally extended from action predicates to predicates of all sorts; thus for instance I myself proposed that it extend to all heads in the X' system, including Nouns. A number of linguistic contexts, including those of causation (a relation between events), and accomplishment predicates (involving two events, as process and telos), invite us to consider event complexes. Moreover, there is reason to appeal to an 'E-position', as I called it, within modifiers that are themselves predicates of events (I expand upon this point in section 3 below). As Maienborn appreciates, the analytic wheel has turned: instead of looking for detailed considerations that would practically compel acknowledgement of the E-position in this or that construction, we assume that the position is always available, and we take the consequences for universal language design and for language difference, both syntactic and semantic.
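Davidson's original 1967 illustration of the hidden event position (a textbook example, not taken from this reply): 'Jones buttered the toast in the bathroom with a knife at midnight' is analysed as

∃e [ Buttered(Jones, the toast, e) ∧ In(the bathroom, e) ∧ With(a knife, e) ∧ At(midnight, e) ]

so each adverbial modifier is just a further predicate of the same event, and dropping a modifier is valid conjunction elimination. The 'E-position' generalizes this hidden argument slot beyond action predicates.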

Appeared in: Theoretical Linguistics Vol 31, No 3 (2005)

Posted by Tony Marmo at 00:01 GMT
Updated: Saturday, 19 November 2005 05:18 GMT

Topic: HUMAN SEMANTICS

Abandoning Coreference


By Ken Safir

In order to linguistically evaluate what a sentence is permitted to mean (not what it actually means), we do not have to know what a speaker intends to say. Grammar permits us to determine a range of meanings a given coconstrual can have and compute which meanings it cannot have - the rest is not a matter for the grammar at all. In saying so, I am certainly not advocating that it is of no consequence for anybody to examine notions of what people intend to accomplish by uttering what they do - doubtless a complete picture of communicative situations requires such a project. I am explicitly arguing that the full interpretation of a sentence is something greater than the result of formal grammar. In other words, I am insisting, as Lasnik and Chomsky do, on a line between formal grammar and the uses to which the products of formal grammar are put.

To appear in Thought, Reference and Experience: Themes from the Philosophy of Gareth Evans. Ed. J. L. Bermudez. Oxford: Oxford University Press.

Posted by Tony Marmo at 00:01 GMT
Updated: Tuesday, 22 November 2005 11:41 GMT

Topic: HUMAN SEMANTICS

Linguistic Side Effects


By Chung-chieh Shan

Apparently non-compositional phenomena in natural languages can be analysed like computational side effects in programming languages: anaphora can be analysed like state, intensionality can be analysed like environment, quantification can be analysed like delimited control, and so on. We thus term apparently non-compositional phenomena in natural languages 'linguistic side effects'. We put this new, general analogy to work in linguistics as well as programming-language theory.
In linguistics, we turn the continuation semantics for delimited control into a new implementation of quantification in type-logical grammar. This graphically-motivated implementation does not move nearby constituents apart or distant constituents together. Just as delimited control encodes many computational side effects, quantification encodes many linguistic side effects, in particular anaphora, interrogation, and polarity sensitivity. Using the programming-language concepts of evaluation order and multistage programming, we unify four linguistic phenomena that had been dealt with only separately before: linear scope in quantification, crossover in anaphora, superiority in interrogation, and linear order in polarity sensitivity. This unified account is the first to predict a complex pattern of interaction between anaphora and raised-wh questions, without any stipulation on either. It also provides the first concrete processing explanation of linear order in polarity sensitivity.
In programming-language theory, we transfer a duality between expressions and contexts from our analysis of quantification to a new programming language with delimited control. This duality exchanges call-by-value evaluation with call-by-name evaluation, thus extending a known duality from undelimited to delimited control. The same duality also exchanges the familiar 'let' construct with the less-familiar 'shift' construct, so that the latter can be understood in terms of the former.
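As a toy illustration of treating quantification as a control-like effect (a minimal sketch of my own in Haskell, over a tiny finite domain; it is not Shan's type-logical implementation): a quantificational noun phrase denotes a function from its own continuation, the rest of the sentence, to a truth value, which is what lets it take scope over surrounding material much as a control operator captures its evaluation context.

-- Quantifiers as continuation consumers: an illustrative sketch, not Shan's system.
type E = String            -- individuals (toy domain)
type T = Bool              -- truth values
type K a = (a -> T) -> T   -- a value that consumes its own continuation

domain :: [E]
domain = ["al", "bea", "cy"]

everyone, someone :: K E
everyone k = all k domain  -- universal: scopes over whatever continuation it is handed
someone  k = any k domain  -- existential

left :: E -> T
left = (/= "cy")           -- toy extension: everyone but cy left

-- "Everyone left" / "Someone left": the subject quantifier takes the rest
-- of the sentence as its continuation.
s1, s2 :: T
s1 = everyone left         -- False in this toy model
s2 = someone left          -- True

main :: IO ()
main = print (s1, s2)

Delimited control enters when such a quantifier must take scope only up to some boundary; in the dissertation that is, roughly, the role of the shift/reset analysis and of the evaluation-order and multistage-programming machinery.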


PhD Dissertation, Harvard University

Posted by Tony Marmo at 00:01 GMT
Updated: Monday, 21 November 2005 09:04 GMT
