
Cognitive Primitives and Bayesian Number Word Learning

Abstract

We use the computational Bayesian learning model in Piantadosi, Tenenbaum, and Goodman (2012) to explore how different combinations of cognitive primitives and frequency distributions affect the learning of natural numbers. We find that the model converges on the natural numbers through attested developmental stages only under very restricted sets of primitives and frequency distributions. Assuming the size principle familiar from Bayesian approaches to inductive generalization, it would be natural to conclude that there are sharp constraints on the primitives out of which humans build natural numbers, some of which we hope to elucidate below.
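For readers less familiar with the size principle invoked above, one standard formulation (following Bayesian accounts of inductive generalization) is sketched here; the notation |h| for the size of a hypothesis's extension is our own and is not taken from the paper. If the learner assumes the n observed examples are sampled independently and uniformly from the extension of hypothesis h, then

\[
P(d_1,\dots,d_n \mid h) =
\begin{cases}
\left(\dfrac{1}{|h|}\right)^{n} & \text{if } d_1,\dots,d_n \in h,\\[4pt]
0 & \text{otherwise,}
\end{cases}
\]

so among hypotheses consistent with the data, smaller (more restrictive) hypotheses receive exponentially higher likelihood as more examples accumulate.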
