Empiricism and language learnability

Authors Chater, Nick and Clark, Alexander and Goldsmith, John A. and Perfors, Andy Year 2015 Abstract This interdisciplinary new work explores one of the central theoretical problems in linguistics: learnability. The authors, from different backgrounds (linguistics, philosophy, computer science, psychology and cognitive science), explore the idea that language acquisition proceeds through general-purpose learning mechanisms, an approach that is broadly empiricist both methodologically and psychologically. For many years, the empiricist approach has been taken to be infeasible on practical and theoretical grounds. [Read More]

Learnability

Authors Clark, Alexander Year 2015 Abstract Reviews of learnability in linguistics tend to focus on negative results, with nativists stressing those results and researchers of a more empiricist persuasion downplaying them. This chapter discusses the theory of learnability, or grammatical inference, from a positive perspective. It focuses on the methodological issues involved in applying the tools of mathematical analysis to the empirical problem of language acquisition, on the various assumptions that one makes, and on the problems of grammatical inference. [Read More]

An Algebraic Approach to Multiple Context-Free Grammars

Authors Clark, Alexander and Yoshinaka, Ryo Year 2014 Abstract We define a class of algebraic structures, Paired Complete Idempotent Semirings (PCIS), which are appropriate for defining a denotational semantics for multiple context-free grammars of dimension 2 (2-MCFGs). We demonstrate that homomorphisms of this structure will induce well-behaved morphisms of the grammar, and generalize the syntactic concept lattice from context-free grammars to the 2-MCFG case. We show that this lattice is the unique minimal structure that will interpret the grammar faithfully and that therefore 2-MCFGs without mergeable nonterminals will have nonterminals that correspond to elements of this structure. [Read More]

An introduction to multiple context free grammars for linguists

Authors Clark, Alexander Year 2014 Abstract This is a gentle introduction to Multiple Context-Free Grammars (MCFGs), intended for linguists who are familiar with context-free grammars and movement-based analyses of displaced constituents, but unfamiliar with Minimalist Grammars or other mildly context-sensitive formalisms.
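
To give a flavour of the formalism (this illustration is ours, not an example taken from the paper), a 2-dimensional MCFG can generate the non-context-free language \(\{a^n b^n c^n : n \ge 0\}\) by letting a nonterminal \(N\) derive pairs of strings rather than single strings:

\[
S(xy) \leftarrow N(x, y), \qquad
N(ax,\, byc) \leftarrow N(x, y), \qquad
N(\varepsilon, \varepsilon).
\]

Here \(N\) derives exactly the pairs \((a^n,\, b^n c^n)\), and \(S\) concatenates the two components, so the grammar yields \(a^n b^n c^n\), which no context-free grammar can generate.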


Distributional learning of parallel multiple context-free grammars

Authors Clark, Alexander and Yoshinaka, Ryo Year 2014 Abstract Natural languages require grammars beyond context-free for their description. Here we extend a family of distributional learning algorithms for context-free grammars to the class of Parallel Multiple Context-Free Grammars (PMCFGs). These grammars have two additional operations beyond the simple context-free operation of concatenation: the ability to interleave strings of symbols, and the ability to copy or duplicate strings. This allows the grammars to generate some non-semilinear languages, which are outside the class of mildly context-sensitive grammars. [Read More]
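
As a sketch of what the copying operation makes possible (again our illustration, not one from the paper), a PMCFG rule may use the same variable twice on the right of the defined nonterminal:

\[
S(x) \leftarrow A(x), \qquad
A(xx) \leftarrow A(x), \qquad
A(a).
\]

Each application of the middle rule doubles the yield of \(A\), so the grammar generates \(\{a^{2^n} : n \ge 0\}\), whose set of string lengths is not semilinear and which therefore lies outside the (non-parallel) multiple context-free languages.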

Complexity in Language Acquisition

Authors Clark, Alexander and Lappin, Shalom Year 2013 Abstract Learning theory has frequently been applied to language acquisition, but discussion has largely focused on information-theoretic problems, in particular on the absence of direct negative evidence. Such arguments typically neglect the probabilistic nature of cognition and learning in general. We argue first that these arguments, and analyses based on them, suffer from a major flaw: they systematically conflate the hypothesis class and the learnable concept class. [Read More]

Learning Trees from Strings: A Strong Learning Algorithm for some Context-Free Grammars

Authors Clark, Alexander Year 2013 Abstract Standard models of language learning are concerned with weak learning: the learner, receiving as input only information about the strings in the language, must learn to generalise and to generate the correct, potentially infinite, set of strings generated by some target grammar. Here we define the corresponding notion of strong learning: the learner, again receiving only strings as input, must learn a grammar that generates the correct set of structures or parse trees. [Read More]
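
Schematically (the notation here is ours, not the paper's), writing \(L(G)\) for the string language and \(T(G)\) for the set of parse trees that a grammar \(G\) assigns, the two criteria can be contrasted as

\[
\text{weak learning: } L(\hat{G}) = L(G^{*}), \qquad
\text{strong learning: } T(\hat{G}) = T(G^{*}) \ \text{(up to relabelling of categories)},
\]

where \(G^{*}\) is the target grammar and \(\hat{G}\) the learner's hypothesis. In both cases the learner observes only strings drawn from \(L(G^{*})\); the strong learner must nonetheless recover the hidden constituent structure.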

The syntactic concept lattice: Another algebraic theory of the context-free languages?

Authors Clark, Alexander Year 2013 Abstract The syntactic concept lattice is a residuated lattice associated with a given formal language; it arises naturally as a generalization of the syntactic monoid in the analysis of the distributional structure of the language. In this article we define the syntactic concept lattice and present its basic properties and its relationship to the universal automaton and the syntactic congruence. We consider several equivalent definitions: as a Galois connection, as maximal factorizations, and finally, using universal algebra, as an object with a certain universal (terminal) property in the category of complete idempotent semirings that recognize a given language, applying techniques from automata theory to the theory of context-free grammars (CFGs). [Read More]
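
For readers unfamiliar with the construction, the underlying Galois connection can be stated as follows (in one standard notation). For a language \(L \subseteq \Sigma^*\), a set of strings \(S \subseteq \Sigma^*\) and a set of contexts \(C \subseteq \Sigma^* \times \Sigma^*\) map to each other via

\[
S^{\triangleright} = \{(l, r) : \forall w \in S,\; lwr \in L\}, \qquad
C^{\triangleleft} = \{w : \forall (l, r) \in C,\; lwr \in L\},
\]

and a syntactic concept is a pair \((S, C)\) with \(S^{\triangleright} = C\) and \(C^{\triangleleft} = S\); ordered by inclusion of their string components, these concepts form a complete lattice.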

Beyond Semilinearity: Distributional Learning of Parallel Multiple Context-free Grammars

Authors Clark, Alexander and Yoshinaka, Ryo Year 2012 Abstract Semilinearity is widely held to be a linguistic invariant but, controversially, some linguistic phenomena in languages such as Old Georgian and Yoruba seem to violate this constraint. In this paper we extend distributional learning to the class of parallel multiple context-free grammars, a class which, as far as is known, includes all attested natural languages, even taking an extreme view of these examples. [Read More]
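
For reference (this is the standard definition, not something specific to the paper): a language over an alphabet of size \(k\) is semilinear if its Parikh image, the set of vectors in \(\mathbb{N}^k\) counting the occurrences of each symbol, is a finite union of linear sets of the form

\[
\{\, v_0 + n_1 v_1 + \dots + n_m v_m \;:\; n_1, \dots, n_m \in \mathbb{N} \,\}.
\]

A finite union of such sets can only grow linearly, which is why copying languages such as \(\{a^{2^n} : n \ge 0\}\) fall outside the semilinear class.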