794B Learnability and Syntax

Fall 2007


Th 2:30-5:15 - Herter 106


Rajesh Bhatt

Thomas Roeper


This course will run on several levels. Our general goal is to consider how concepts from learnability are relevant to current linguistic theories.

First, the course will include an ongoing tutorial on the mathematical concepts that are utilized.

Second, we will consider the classic logic of subset theory (Berwick), the absence of negative evidence, and the role of syntactic and semantic triggers. We will look at these in light of modern theories of Multiple Grammars (Kroch, Roeper, Yang), Optimality Theory, and Minimalism. Theories, evidence, and empirical techniques now provide a much subtler range of syntactic, semantic, acquisition, and parsing evidence than was available when learnability theories were first developed. How far does the evidence fit the predictions, and how much can learnability concepts contribute to new work?

Third, we will consider new approaches to the nature of the input, algorithms that respond to frequency, and memory assumptions (Allen, Roeper, others). We will consider these in light of non-linguistic machine learning theories, asking how they might work in current theoretical models, or how one might develop theories that utilize both UG and data assimilation (for instance, Yang).

Students will be expected to report on papers from the literature (see Bertolo, Fodor and Sakas, Gibson and Wexler) and possibly to carry out small projects involving the development of analytic algorithms. We intend to have visitors and guest speakers from other departments.
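As a taste of the "algorithms that respond to frequency" mentioned above, here is a toy sketch in the spirit of Yang's variational learner: the learner entertains two competing grammars, probabilistically selects one for each input sentence, and rewards or punishes its weight depending on whether the selected grammar parses the input. The grammars, corpus, and parse test below are invented for illustration and are not drawn from any of the course readings.

```python
import random

def variational_learner(inputs, can_parse, p=0.5, gamma=0.02):
    """A toy Yang-style variational learner over two grammars, G1 and G2.

    p is the learner's current probability of selecting G1; gamma is the
    learning rate of a linear reward-penalty update (both are illustrative
    choices, not values from the literature).
    """
    for sentence in inputs:
        grammar = "G1" if random.random() < p else "G2"
        if can_parse(grammar, sentence):
            # Reward: shift probability toward the grammar that parsed the input.
            p = p + gamma * (1 - p) if grammar == "G1" else p * (1 - gamma)
        else:
            # Punish: shift probability away from the grammar that failed.
            p = p * (1 - gamma) if grammar == "G1" else p + gamma * (1 - p)
    return p

def parses(grammar, sentence):
    # Toy parse test: "ambiguous" sentences are compatible with both grammars;
    # "G1-only" sentences can be parsed only by G1.
    return sentence == "ambiguous" or grammar == "G1"

# Toy corpus: 80% of the input unambiguously favors G1, so the learner's
# weight on G1 should rise well above its 0.5 starting point.
corpus = ["G1-only"] * 80 + ["ambiguous"] * 20
random.shuffle(corpus)
final_p = variational_learner(corpus, parses)
```

Note how negative evidence plays no role here: the learner never hears that a sentence is ungrammatical; the competing grammar is driven out solely because it fails to parse positive input at the frequency with which that input occurs.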