Less is more: why all paradigms are defective, and why that is a good thing

Date

2019-06-21

Abstract

Only a fraction of lexemes are encountered in all their paradigm forms in any corpus, or even in the lifetime of any speaker. This raises the question of how native speakers confidently produce and comprehend word forms that they have never witnessed. We present the results of an experiment using a recurrent neural network computational learning model. In particular, we compare the model’s production of unencountered forms under two types of training data for Russian nouns, verbs, and adjectives: full paradigms vs. single word forms. In the long run, the model performs better when exposed to the more naturalistic training on single word forms, even though the full-paradigm training data is much larger, since it includes every form of every word. We discuss why “defective” paradigms may be better for human learners as well.
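The contrast between the two training regimes can be sketched in a few lines. The snippet below is a minimal illustration, not the authors’ actual setup: the toy lexicon, form lists, and function names are all hypothetical, and the sampling stands in for the naturalistic, corpus-like exposure described in the abstract.

```python
import random

# Hypothetical toy lexicon: each lexeme maps to a list of its paradigm
# forms (illustrative transliterated Russian nouns, not real training data).
PARADIGMS = {
    "stol":  ["stol", "stola", "stolu", "stolom", "stole", "stoly"],
    "kniga": ["kniga", "knigi", "knige", "knigu", "knigoj", "knigax"],
    "okno":  ["okno", "okna", "oknu", "oknom", "okne", "oknax"],
}

def full_paradigm_training(paradigms):
    """Regime 1: every form of every lexeme appears in the training data."""
    return [form for forms in paradigms.values() for form in forms]

def single_form_training(paradigms, n_tokens, seed=0):
    """Regime 2 (naturalistic): sample individual word forms, so most
    lexemes are witnessed in only a subset of their paradigm cells."""
    rng = random.Random(seed)
    pool = [form for forms in paradigms.values() for form in forms]
    return [rng.choice(pool) for _ in range(n_tokens)]

full = full_paradigm_training(PARADIGMS)            # complete coverage
single = single_form_training(PARADIGMS, n_tokens=10)  # partial coverage
```

The point of the comparison is that the single-form set leaves paradigm gaps, so a learner trained on it must generalize to unencountered cells rather than memorize them.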

Description

This record is for an offprint of an article published in Corpus Linguistics and Linguistic Theory on 2019-06-21.

Citation

Janda, Laura A., and Tyers, Francis M. "Less is more: why all paradigms are defective, and why that is a good thing." Corpus Linguistics and Linguistic Theory, 2019-06-21.

Journal

Corpus Linguistics and Linguistic Theory
