WWN: language acquisition and generalization using association
Based on recent advances in understanding and modeling cortical processing for space and time, we propose a developmental, general-purpose model for language acquisition using multiple motor areas. The thesis presents two main ideas: a) early language acquisition is a grounded and incremental process, i.e., the network learns as it performs in the real world; b) language is a complex perceptual, cognitive and motor skill that can be acquired through associative learning and skill-transfer principles. The network architecture is informed by existing neuroanatomic studies and the associative learning literature in psychology. Through the ventral pathway, the "what" motor learns, abstracts and feeds back (as recurrent top-down context) information related to the meaning of the text. Via the dorsal pathway, the "where/how" motor learns, abstracts and feeds back (as top-down context) information related to the spatial properties of text, e.g., where the text is on a page. This is a major departure from the traditional symbolic and connectionist approaches to natural language processing (NLP) -- the motor areas, i.e., actions or abstract meanings, play the role of "state hubs" in language acquisition and understanding. The "hubs" correspond to multiple concepts that form the state of the current context. As any human-communicable concept can be either verbally stated (what) or demonstrated through actions (how), this model appears to be the first general-purpose developmental model for general language acquisition, although the size of our experiments is still limited. Furthermore, unlike traditional NLP approaches, syntax is treated as a special case of actions. The major novelty in our language acquisition is the ability to generalize, going beyond a probability framework, by simulating the primary, secondary and higher-order associations observed in animal learning through the generalization of internal distributed representations.
A basic architecture that enables such generalization is the overall distributed representation: not only a retina image but also an array of muscles is treated as a high-dimensional image. An emergent internal distributed representation is critical for going beyond experience to enable four types of generalization: member-to-class, subclass-to-superclass, member-to-member, and relation-specification. In our cortex-inspired model, syntax and semantics are not treated differently, but as emergent behaviors that arise from grounded real-time experience.
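The associative mechanism the abstract describes can be illustrated with a toy sketch. This is not the thesis's implementation; all names and parameters (`SENSORY_DIM`, `MOTOR_DIM`, `top_k`, learning rate, etc.) are hypothetical choices for illustration. An internal layer receives both a bottom-up sensory vector and a top-down motor "hub" vector; a few best-matching neurons fire, yielding a sparse distributed response; and the firing neurons move their weights toward the current paired input, Hebbian-style, so sensory patterns become associated with motor context.

```python
import numpy as np

# Illustrative sketch only -- not the WWN implementation from the thesis.
rng = np.random.default_rng(0)

SENSORY_DIM = 16   # bottom-up input (e.g., a flattened retinal patch)
MOTOR_DIM = 4      # top-down context from a motor "hub" (e.g., a meaning state)
N_NEURONS = 8      # internal neurons forming a distributed representation

# Each neuron's weight vector spans both bottom-up and top-down inputs,
# so the internal representation is shaped by sensory AND motor context.
W = rng.random((N_NEURONS, SENSORY_DIM + MOTOR_DIM))
W /= np.linalg.norm(W, axis=1, keepdims=True)

def respond(sensory, motor_context, top_k=2):
    """Top-k competition: only the best-matching neurons fire (sparse coding)."""
    x = np.concatenate([sensory, motor_context])
    x = x / (np.linalg.norm(x) + 1e-12)
    pre = W @ x                          # match of each neuron to the input
    r = np.zeros(N_NEURONS)
    winners = np.argsort(pre)[-top_k:]   # indices of the top_k responses
    r[winners] = pre[winners]
    return r, x

def hebbian_update(r, x, lr=0.1):
    """Firing neurons pull their weights toward the current paired input."""
    for i in np.nonzero(r)[0]:
        W[i] += lr * r[i] * (x - W[i])
        W[i] /= np.linalg.norm(W[i])     # keep weights on the unit sphere

# One associative step: a sensory pattern paired with a motor-hub context.
sensory = rng.random(SENSORY_DIM)
context = np.eye(MOTOR_DIM)[1]           # one-hot "concept 1" hub state
r, x = respond(sensory, context)
hebbian_update(r, x)
```

Because the top-down motor vector is part of the input, the same sensory pattern paired with different hub states recruits (and trains) different winning neurons, which is one way an internal distributed representation can carry context-dependent meaning.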
- In Collections: Electronic Theses & Dissertations
- Copyright Status: In Copyright
- Material Type: Theses
- Authors: Miyan, Kajal
- Thesis Advisors: Weng, Juyang
- Committee Members: Weng, Juyang; Jin, Rong; Tong, Yiying; Liu, Xiang-Yang
- Date Published: 2011
- Subjects: Computational linguistics; Language acquisition; Natural language processing (Computer science)
- Program of Study: Computer Science
- Degree Level: Masters
- Language: English
- Pages: xi, 85 pages
- ISBN: 9781124604954; 1124604952
- Permalink: https://doi.org/doi:10.25335/hwm0-9979