A word-experience model of Chinese character learning

From Pslc


Computational models of learning can advance the science of learning by making their assumptions about the learning process explicit, thereby providing model-based hypotheses that can be tested against data. Our intention is to demonstrate how a model that we have developed for English word reading (Reichle & Perfetti, 2003) can be applied to learning to read Chinese characters. The model's basic assumptions are: (1) that the ability to read words is acquired on a word-by-word basis, and (2) that the generalized (robust) learning of a word results from many different encoding (reading) contexts, whose variability becomes less important with repeated encounters. On this account, a robust, context-general word representation results from the extraction of a stable form from its many experienced variations. The application of this model to Chinese character learning is especially compelling because, unlike alphabetic learning, which affords generalization across words, Chinese learning proceeds character by character. The proposed modeling project will thus yield a computational framework for examining the consequences of contextual variability on the learning and retention of lexical information in a specific academic domain: the acquisition of a second language. The project will also provide a more general analytical framework for examining the factors that contribute to robust learning across many domains.
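The second assumption, that a stable, context-general form is extracted from many variable encoding contexts, can be illustrated with a toy simulation. This is only a minimal sketch: the feature vectors, the uniform noise model, and all function names below are hypothetical illustrations, not part of the published model.

```python
import random

random.seed(0)

# Hypothetical "true" representation of a character, as a small feature vector.
STABLE_FORM = [1.0, 0.0, 0.5, 0.25]

def encounter(context_noise=0.5):
    """One reading experience: the stable form perturbed by contextual variation."""
    return [f + random.uniform(-context_noise, context_noise) for f in STABLE_FORM]

def learned_form(n_encounters):
    """Average the encoded instances; contextual variation cancels with repetition."""
    traces = [encounter() for _ in range(n_encounters)]
    return [sum(dim) / n_encounters for dim in zip(*traces)]

def error(estimate):
    """Worst-case deviation of the learned form from the stable form."""
    return max(abs(e - s) for e, s in zip(estimate, STABLE_FORM))

# With more encounters, the averaged representation approaches the stable form,
# i.e., the variability of individual contexts matters less and less.
few = error(learned_form(2))
many = error(learned_form(2000))
```

After many encounters the residual error of the averaged representation is small relative to the error after only a couple of encounters, which is the sense in which contextual variability "becomes less important with repeated encounters."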

Although the micro-level theoretical framework of the PSLC assumes that each knowledge component has a strength and a set of retrieval features, and that repeated applications of a knowledge component increase its strength and generalize its retrieval features, this model offers an alternative, instance-based account. It assumes that each application of a knowledge component creates a new instance of it in memory, and that the encoding includes the contextual features active in working memory at the time of application.
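The instance-based account described above can be sketched in a few lines of code: each application of a knowledge component (here, reading a character) stores a new instance tagged with the contextual features active at the time, and retrieval strength is the summed similarity of a probe context to all stored instances. The class, the Jaccard similarity measure, and the example feature labels are illustrative assumptions, not the model's actual implementation.

```python
class InstanceMemory:
    """Instance-based memory: one stored trace per application, context included."""

    def __init__(self):
        self.instances = []  # list of (character, frozenset of context features)

    def apply(self, character, context):
        """Applying the knowledge component encodes a new instance with its context."""
        self.instances.append((character, frozenset(context)))

    def retrieval_strength(self, character, context):
        """Summed context similarity over every stored instance of the character.

        More applications, and more varied contexts, raise strength for novel probes."""
        probe = frozenset(context)
        strength = 0.0
        for stored_char, stored_ctx in self.instances:
            if stored_char == character:
                overlap = len(probe & stored_ctx)
                total = len(probe | stored_ctx) or 1
                strength += overlap / total  # Jaccard similarity of contexts
        return strength

memory = InstanceMemory()
memory.apply("马", {"sentence:horse-race", "font:kai", "position:final"})
memory.apply("马", {"sentence:riding", "font:song", "position:initial"})
memory.apply("马", {"sentence:farm", "font:song", "position:medial"})

# A novel context still retrieves the character through partial feature overlap,
# because each stored instance contributes its own similarity to the probe.
novel = {"sentence:traffic", "font:song", "position:final"}
strength = memory.retrieval_strength("马", novel)
```

Because every application adds a trace rather than adjusting a single strength parameter, generalization falls out of the overlap between a new context and the accumulated, varied instances, which is the contrast with the strength-plus-retrieval-features account drawn in the paragraph above.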