

  1. An introduction to computational psycholinguistics: Modeling human sentence processing
     Shravan Vasishth, University of Potsdam, Germany
     http://www.ling.uni-potsdam.de/~vasishth, vasishth@acm.org
     September 2005, Bochum

     ACT-R introduction: Why cognitive architectures?
     The inspiration is Newell’s idea (Newell himself developed another architecture called SOAR, which we will not get into in this course).
     • Aims towards a unified theory of mind as a computational process.
     • Provides means for integrating assumptions about domain-specific behavior with empirically-based research on human cognition (e.g., constraints on memory, eye movements).
     • Provides a framework for interpreting brain imaging data.
     • Applications in HCI, education, simulated cognitive agents for all kinds of hazardous operations.

  2. The units of knowledge: Chunks

     (CLEAR-ALL)
     (CHUNK-TYPE addition-fact addend1 addend2 sum)
     (CHUNK-TYPE integer value)
     (ADD-DM
       (fact3+4 isa addition-fact addend1 three addend2 four sum seven)
       (three isa integer value 3)
       (four isa integer value 4))

     Encoding sentences as linked chunks
     Fact: The cat sits on the mat.
     (Add-DM
       (fact001 isa proposition agent cat01 action sits_on object mat))
     Fact: The black cat with five legs sits on the mat.
     (cat01 isa cat legs 5 color black)
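     The two facts are linked through the chunk cat01, which appears both as the agent slot of the proposition and as a chunk with its own slots. A minimal sketch of the combined declarative memory, assuming a hypothetical chunk-type declaration for cat (the slide leaves it implicit):

     ;;; Sketch only: the chunk-type cat and its slots legs/color are assumptions.
     (CHUNK-TYPE proposition agent action object)
     (CHUNK-TYPE cat legs color)
     (ADD-DM
       (fact001 isa proposition agent cat01 action sits_on object mat)
       (cat01   isa cat legs 5 color black))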

  3. Exercise

     (Chunk-Type proposition agent action object)
     (Chunk-Type professor money-status age)
     (Chunk-Type house kind price status)

     Fact: The rich young professor buys a beautiful and expensive city house.
     What chunks do we need to add to the declarative memory? (One possible encoding is sketched after this slide.)

     Productions: The unit of processing
     • Condition-action control structure.
     • Has a default execution time of 50 milliseconds.
     • Presents a serial bottleneck in an otherwise parallel system.
     • Symbolic realization of the flow of information from cortex to basal ganglia and back.
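     Returning to the exercise: one possible encoding, using the chunk-types given on the slide, is sketched here. The chunk names (fact002, prof01, house01) and the mapping of the adjectives onto slots are illustrative assumptions, not an official solution.

     ;;; Sketch only: chunk names and slot assignments are assumptions.
     (ADD-DM
       (fact002 isa proposition agent prof01 action buys object house01)
       (prof01  isa professor money-status rich age young)
       (house01 isa house kind city price expensive status beautiful))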

  4. The syntax of production rules

     Syntax:
     (p production-name
        specification of buffer tests            ; conditions
        ==>                                      ; delimiter
        specification of buffer transformations) ; actions

     An example of a production rule:
     (p predict-an-IP
        =retrieval>
           isa            determiner
           pronunciation  "a"
           ...            ; [other relevant features of the determiner]
     ==>
        +retrieval>
           isa     noun-prediction
           number  singular   ;;; we call this a retrieval cue
        ...
     )
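     A complete, self-contained production pair might look like the following sketch. The chunk-types (parse-goal, noun-prediction) and the slot values are hypothetical, chosen only to make the condition-action structure concrete; they are not taken from the slides.

     ;;; Sketch only: chunk-types and slot values are assumptions.
     (p predict-a-noun
        =goal>                       ; test the goal buffer
           isa    parse-goal
           state  attach-determiner
        =retrieval>                  ; a determiner has just been retrieved
           isa            determiner
           pronunciation  "a"
     ==>
        =goal>                       ; update the current parse state
           state  expect-noun
        +retrieval>                  ; retrieval cue: request a singular noun prediction
           isa     noun-prediction
           number  singular
     )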

  5. Buffers in ACT-R
     1. Goal buffer: represents one’s current state in a task, and preserves information across productions.
     2. Retrieval buffer: information retrieved from DM stored here; locus of activation computations.
     3. Visual buffers
     4. Auditory buffers
     5. Manual buffers
     6. Vocal buffers

     Chunk activation: The subsymbolic level

     A_i = B_i + \sum_j W_j S_{ji} + \sum_k MP_k \, Sim_{kl} + N(0, s)    (1)

     • A_i: the activation of a chunk i
     • B_i: base-level activation of i, reflects past usefulness
     • \sum_j W_j S_{ji}: associative activation, relevance to the general context
     • \sum_k MP_k \, Sim_{kl}: mismatch penalty
     • N(0, s): noise
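     To see how Equation 1 combines the terms, here is a worked instance with made-up numbers (illustrative values only, not ACT-R defaults). Suppose B_i = 0.3, two retrieval cues each weighted W_j = 0.5 with associative strengths S_{1i} = 1.1 and S_{2i} = 0.4, one mismatching feature contributing MP \, Sim = -0.5, and noise sampled at 0:

     A_i = 0.3 + (0.5 \cdot 1.1 + 0.5 \cdot 0.4) - 0.5 + 0 = 0.55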

  6. Mapping activation to retrieval time

     Retrieval Time_i = F \times e^{-A_i}    (2)

     Among the chunks that match a retrieval request, the one with the highest activation is retrieved provided it exceeds a predefined threshold τ.

     Base-level activation and associative activation

     B_i = \ln \left( \sum_{j=1}^{n} t_j^{-d} \right)    (3)        A_i = B_i + \sum_j W_j S_{ji}

     Chunks are retrieved by a content-addressed, associative retrieval process. Associative retrieval interference arises because the strength of association from a cue is reduced as a function of the number of items associated with the cue. This is captured by Equation 4, which reduces the maximum associative strength S by the log of the “fan” of item j, i.e., the number of items associated with j.

     S_{ji} = S - \ln(fan_j)    (4)
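     A worked instance of Equations 2-4 with illustrative parameter values (only the decay d = 0.5 is a commonly used default; F and S are made up here). With a latency factor F = 1.0 s and the activation A_i = 0.55 computed above, Equation 2 gives

     Retrieval Time_i = 1.0 \times e^{-0.55} \approx 0.58 s

     For Equation 3, a chunk retrieved 1 s and 10 s ago, with decay d = 0.5, has

     B_i = \ln(1^{-0.5} + 10^{-0.5}) = \ln(1.316) \approx 0.27

     For Equation 4, if the maximum associative strength is S = 1.5 and cue j is associated with three items (fan_j = 3), then

     S_{ji} = 1.5 - \ln 3 \approx 0.40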

  7. Base-level activation and associative activation

     B_i = \ln \left( \sum_{j=1}^{n} t_j^{-d} \right)        A_i = B_i + \sum_j W_j S_{ji}

     [Figure: activation (y-axis, roughly 0.5 to 1.5) as a function of the number of retrievals (x-axis), illustrating decay of base-level activation and interference in the associative component.]
