Class 11: June 20
Topic: Verbal decomposition
Draw the trees for the following two sentences, and explain how your solution accounts for the observed readings of wieder ('again').
(1) ...dass ich die Tür wieder öffnete. ('...that I opened the door again')
(2) ...dass ich wieder die Tür öffnete. ('...that I again opened the door')
If you need a hint, you may want to consult the following paper.
Class 10: June 13
Topic: Agreement (IV): split agreement; position-dependent agreement; the clitic/agreement distinction
- Rezac, Milan. 2004. Elements of cyclic syntax. Doctoral dissertation, University of Toronto. (read section 2.3: "Cyclic expansion: a case study in Basque agreement displacement")
Class 9: June 6
Topic: Agreement (III): context sensitive agreement
- Bejar, Susana. 2003. Phi syntax: a theory of agreement. Doctoral dissertation, University of Toronto. (read chapter 2, "Features, underspecification and locality". It's a very abstract chapter, so focus on understanding the idea of decomposing person and number into more fundamental features)
Class 8: May 30
Topic: Agreement (II): defective intervention effects
- Rezac, Milan. 2004. Elements of cyclic syntax. Doctoral dissertation, University of Toronto.
Sections 1.1--1.4 are a very close reading of the two papers by Chomsky ("Derivation by phase" and "Minimalist Inquiries") where the Agree system was originally proposed. These pages should answer all your technical questions about how Agree is supposed to work.
During this week, we will focus mostly on section 2.2, "Cyclic displacement and agreement cyclicity".
Class 7: May 23
Topic: Agreement and Case (I)
- Wurmbrand, Susi. 2006. Licensing Case. Journal of Comparative Germanic Linguistics 18:175--236 (pay special attention to section 2, "Agree without movement")
In general, if you have an interest in the structure of German verb phrases, you should spend some time reading Wurmbrand's work, which you can access at this address.
Class 6: May 16
Topic: Movement and locality (III)
Class 5: May 9
Topic: Movement and locality (II)
Class 4: May 3
Topic: Movement and locality (I)
Class 3: April 25
Topic: The copy theory of movement: some evidence
Required reading: Nunes, Jairo. 2004. Sideward movement and linearization of chains. Cambridge: MIT Press (chapter 1).
Class 2: April 19
Topic: phrase structure and linearization
Required reading: Epstein, Samuel, Erich Groat, Ruriko Kawashima, and Hisatsugu Kitahara. 1998. A derivational approach to syntactic relations. Oxford: Oxford University Press (chapter 1).
Summary of the class:
- Kayne, Richard. 1994. The Antisymmetry of Syntax. Cambridge: MIT Press (the link contains the first 15 pages; the rest of the book is an extended footnote to that, but still interesting to read).
- Chametzky, Robert. 1996. Phrase Structure: from GB to Minimalism. Oxford: Blackwell (detailed tutorial on various versions of the phrase structure system).
- Kornai, András, and Geoffrey Pullum. 1990. The X-bar theory of phrase structure. Language 66:24--50 (very dense and theory-heavy paper).
Class 1: April 12
Topic: Syntactic theory in four easy steps
Reading: no required reading, since this was the first class. If you need further information about the concepts we have talked about in this class, you can try any standard textbook in syntax. Recommendations include:
- Haegeman, Liliane. 1994. Introduction to Government and Binding Theory. Oxford: Blackwell (a bit outdated, but still very useful).
- Radford, Andrew. 1997. Syntactic theory and the structure of English. Cambridge: Cambridge University Press. (also slightly outdated, but with good introductory chapters)
- Adger, David. 2003. Core Syntax. Oxford: Oxford University Press.
- Carnie, Andrew. 2006. Syntax: a generative introduction. Oxford: Blackwell.
- Grewendorf, Günther. 2002. Minimalistische Syntax. Tübingen: A. Francke Verlag.
- Johnson, Kyle. 2010. Introduction to Transformational Grammar. Ms. University of Massachusetts, Amherst.
In addition, at some point you should try to have a look at both Syntactic Structures (Chomsky 1957) and Aspects of the Theory of Syntax (Chomsky 1965). We will cover some of Chomsky's more recent writings as we go along.
Summary of the class
A good syntactic theory is one that allows us to make internally consistent analyses of a great number of constructions. By this, I mean that it allows us to gain new insights into the constructions themselves and make predictions about other aspects of the language. In this course, we are going to cover what is usually called Transformational Grammar, especially its most recent version, known as Minimalism. We will have time to geek out on more advanced stuff later on in the course, but today we are going to start with the very basics.
Any syntactic theory, if it wants to be successful, has to have four components, which I list below. The differences between theories lie in the particular assumptions we make about each of these components. Whenever I am discussing an assumption specific to Minimalism (i.e., not shared by other frameworks), I will explicitly say so.
Component #1: features.
Some people are likely to think that the basic units of a syntactic theory are words, or morphemes. This is correct in some theories, but in Minimalism, the basic unit is the feature, which by convention we enclose in [square brackets]. A feature is, by definition, an indivisible bit of information about syntax. If you find a "feature" that can be subdivided into two or more sub-features, what you have in your hands is not a feature, but rather a feature matrix (plural: matrices), also called a feature set or a feature bundle. A syntactic feature is any kind of information that is relevant for syntax ---e.g., number, person, tense, aspect, animacy, etc.
We need to say a bit more about how information is encoded inside each feature. I am going to present three different (non-equivalent) systems. In this course, we will use either binary or privative features. I will let you know if and when the difference between the two is theoretically significant.
Component #2: the lexicon.
Component #3: the computational system.
Component #4: interfaces.
- Multivalued features: features in this system consist of an attribute (the "name" of the feature) and a value; the notation is [attribute:value]. As its name indicates, features in this system can have an arbitrary number of values. For instance, [person:] can be [person:1], [person:2], or [person:3] (or even more values in languages with an inclusive/exclusive distinction). Values can vary freely across features.
- Binary features: this is a variant of the system above in which every feature has exactly two values, which remain constant across features: plus or minus. A very basic example is the feature [animate], for which we have the variants [+animate] and [-animate]. In order to translate multivalued features into a binary system, we need to break them down into more fundamental features. For instance, [person] is reanalyzed as a combination of [part] (participant in the conversation) and [ad] (addressee). The combination of these two features gives us a four-cell grid.
| ||[+part]||[-part]|
|[+ad]||2nd person|| |
|[-ad]||1st person||3rd person|
I need to mention a couple of things here. First, this way of decomposing multivalued features occasionally leaves blank cells (in general, whenever the multivalued feature in question has an odd number of values). Consider the upper right cell in the table above: it is empty because one cannot be an addressee without also being a participant in the conversation. Second, it is sometimes unclear how a multivalued feature should be decomposed into two or more binary features. Some decompositions (such as person) are well accepted and empirically supported, but others remain controversial.
- Privative features: this is an even more radical reduction in the amount of information that a feature carries, as we eliminate the value completely. The attribute itself is the feature, and its absence is equivalent to a negative value. For instance, an animate noun simply carries [animate], while an inanimate noun lacks it.
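To make the binary decomposition of person more concrete, here is a short sketch (hypothetical illustration, not part of the course materials; the function name and encoding are my own) that maps [part]/[ad] feature bundles to traditional person labels, reproducing the four-cell grid above:

```python
# Hypothetical sketch of the binary person decomposition discussed above.
# A feature bundle is modeled as a dict from attribute to boolean value,
# e.g. {"part": True, "ad": False} stands for [+part, -ad].

def person(bundle):
    """Map a [part]/[ad] feature bundle to a traditional person label.

    Returns None for the impossible combination [+ad, -part]: one cannot
    be an addressee without also being a participant in the conversation.
    """
    part, ad = bundle["part"], bundle["ad"]
    if ad and not part:
        return None          # upper right cell: empty
    if ad:
        return "2nd person"  # [+part, +ad]
    if part:
        return "1st person"  # [+part, -ad]
    return "3rd person"      # [-part, -ad]

# Walk through all four cells of the grid:
for part in (True, False):
    for ad in (True, False):
        bundle = {"part": part, "ad": ad}
        print(bundle, "->", person(bundle))
```

Note how the empty cell of the table falls out automatically: no combination of the two binary values maps to it, which is the sense in which the decomposition predicts a three-way person system from two features.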
General information about this class.
- Lecturer: Luis Vicente
- Office hours: Monday, Tuesday, 12:00--15:00, or by email appointment.
- Credit: 6 LP/ECTS.
- How to pass this class: if you are taking this class for credit, your final grade will be calculated as follows:
|Attendance and participation||20%|
|Squib 1 (due May 9)||20%|
|Squib 2 (due June 13)||20%|
|Term paper (due August 30)||40%|
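For concreteness, the weighting in the table above amounts to a simple weighted average. The sketch below is a hypothetical illustration (the component names and the example scores are invented):

```python
# Hypothetical sketch: compute the weighted course grade from the table above.
# Weights mirror the grading table; component names are invented labels.
WEIGHTS = {
    "attendance": 0.20,
    "squib1": 0.20,
    "squib2": 0.20,
    "term_paper": 0.40,
}

def final_grade(scores):
    """Weighted average of component scores (each on a 0--100 scale)."""
    assert set(scores) == set(WEIGHTS), "all four components are required"
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# Example with invented scores:
print(final_grade({"attendance": 90, "squib1": 80,
                   "squib2": 85, "term_paper": 75}))
```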
A squib is a very short paper (ideally 2 pages; 4 pages is an absolute maximum). A pretty good definition of what a squib should be appears in the Editorial Statement of the Snippets journal:
Squibs are to be brief, self-contained and explicit. They may do any of the following things:
- point out an empirical phenomenon that goes against accepted generalizations or that shows that some aspect of a theory is problematic;
- point out unnoticed minimal pairs that fall outside the scope of any existing theory;
- point out an empirical phenomenon that confirms the predictions of a theory in an area where the theory has not been tested;
- explicitly describe technical inconsistencies in a theory or in a set of frequently adopted assumptions;
- explicitly describe unnoticed assumptions that underlie a theory or assumptions that a theory needs to be supplemented with in order to make desired predictions;
- call attention to little-known or forgotten literature in which issues of immediate relevance are discussed.
The earliest Linguistic Inquiry squibs exemplify the kind of note we would like to publish. Some of them posed unobserved puzzles. For instance, a squib by Postal and Ross in LI 1:1 ("A Problem of Adverb Preposing") noted that whether or not we can construe a sentence-initial temporal adverb with an embedded verb depends on the tense of the matrix verb. A squib by Perlmutter and Ross in LI 1:3 ("Relative Clauses with Split Antecedents"), challenging the prevailing analyses of coordination and extraposition, noted that conjoined clauses neither of which contain a plural noun phrase can appear next to an "extraposed" relative that can only describe groups. Other squibs drew attention to particular theoretical assumptions. For instance, a squib by Bresnan in LI 1:2 ("A Grammatical Fiction") outlined an alternative account of the derivation of sentences containing believe and force, and asked whether there were principled reasons for dismissing any of the underlying assumptions (among them that semantic interpretation is sensitive to details of a syntactic derivation). A squib by Zwicky in LI 1:2 ("Class Complements in Phonology") asked to what extent phonological rules refer to complements of classes. None of these squibs was more than a couple of paragraphs; all of them limited themselves to a precise question or observation.
The final paper will have to be between 15 and 20 pages long (A4, 2.5cm margins, 12pt font, single-spaced). With these specifications, you'll have to write between 7000 and 10000 words. In addition, you will also have to give a short presentation (15 mins) of your work in progress during the last two weeks of the course. Please consult me as soon as possible if you need assistance with your term paper.