
Asymmetries in brain organization for sign language Karen Emmorey - PowerPoint PPT Presentation



  1. Asymmetries in brain organization for sign language
     Karen Emmorey, San Diego State University

  2. Overview
     • What determines asymmetries in the neural organization for language?
       – Is left-hemisphere dominance found for signed languages?
     • Asymmetries in neural organization specific to sign language
       – Facial expression perception
       – Spatial language
       – Sensory-motoric iconicity (sign vs. pantomime)
     • Anatomical asymmetries due to deafness or experience with sign language
       – Auditory cortices
       – Motor cortices

  3. Some potential determinants of hemispheric asymmetries
     • Linguistic functions are left dominant
       – Sign languages exhibit the same linguistic structure as spoken languages (including phonology)
     • Spatial functions are right dominant
       – Sign languages depend on spatial contrasts at all linguistic levels
     • Rapid temporal processing is left dominant (P. Tallal)
       – Phonological transitions in sign are five times slower than in speech (200 ms vs. 40 ms)

  4. Left hemisphere damage leads to sign language aphasia
     [Figure: Rating Scale of Sign Characteristics (Melodic Line, Sign Finding, Articulatory Agility, Grammatical Form, Paraphasia in Running Sign, Sign Comprehension) profiles for left-hemisphere-damaged signers, control deaf signers, and right-hemisphere-damaged signers]
     Poizner et al. (1987); from Bellugi and Hickok (1995)

  5. BUT: Neville et al. (1998) PNAS
     Left-lateralized activation for reading English; bilateral activation for viewing ASL

  6. Comparing audio-visual speech and British Sign Language comprehension
     [Figure: watching a BSL signer vs. watching/listening to an English speaker]
     Bilateral activation for both sign and speech comprehension, but Left > Right
     MacSweeney et al. (2002) Brain

  7. Auditory regions (including Wernicke’s area) are engaged during sign perception
     [Figure: watching a BSL signer; regions of common activation for A-V English and BSL]
     MacSweeney et al. (2002) Brain

  8. Sign Language Production
     Picture naming: nouns, verbs, prepositions
     Broca’s area (left inferior frontal gyrus)
     [Figure: left hemisphere and right hemisphere activation]
     Emmorey et al. (2002, NeuroImage; 2003, Neuropsychologia; 2004, Brain & Language)

  9. Left hemisphere activation for both right-handed and left-handed signing
     Corina et al. (2003) Journal of Cognitive Neuroscience

  10. The left hemisphere is dominant for both sign and speech
      • Classic left hemisphere language areas (e.g., Broca’s area and Wernicke’s area) are involved in sign language production and comprehension
      • Left lateralization is stronger for language production than comprehension for both speech & sign
      • Left hemisphere specialization for language is not tied to the properties of speech

  11. The neural systems underlying facial expression recognition Stephen McCullough, UCSD

  12. Linguistic Facial Expressions
      Syntactic markers: topic marker (t), yes/no question (q), conditional clause marker (cond)
      [Glossed example: “If it rains, we’ll leave.”]
      Adverbial markers: cs “recently”, mm “easily”
      [Glossed example: NOW WRITE]
      Baker & Cokely, 1980

  13. fMRI Study Design
      Task: same/different judgments
      • Experimental task: same/different facial expressions
        – Linguistic (MM) and Emotional (Happy, Angry)
      • Baseline task: same/different gender (male/female)
      McCullough, Emmorey, & Sereno (2005) Cognitive Brain Research

  14. Stimuli: Face Only
      Emotional expressions: happy, sad, angry, surprised, disgust, fear
      Linguistic expressions: MM “easily”, CS “recently”, TH “carelessly”, INT “intense”, PUFF “a lot”, PS “smoothly”

  15. Stimuli: Face with Sign
      Baseline: neutral expression, DISCUSS “discuss (no expression)”
      Linguistic: MM, RUN “run easily”
      Emotional: surprised, STUDY “study (with surprise)”
      McCullough et al. (2005) Cognitive Brain Research

  16. Subjects
      • Deaf: 10 native ASL signers; 5 women, 5 men; deaf from birth; right handed; mean age = 29.4 years
      • Hearing: 10 non-signers; 5 women, 5 men; normal hearing; right handed; mean age = 24.2 years
      McCullough et al. (2005) Cognitive Brain Research

  17. Regions of Interest
      Superior Temporal Sulcus (STS) and Fusiform Gyrus (FG)
      [Figure: right STS and right FG shown on brain images, L/R marked]

  18. Hemisphere Laterality Index: Superior Temporal Sulcus
      [Bar chart: laterality index (positive = rightward bias, negative = leftward bias) for Deaf and Hearing groups, for Emotional and Linguistic expressions, in the Face Only and Face with Verb conditions; * marks significant asymmetries]
      McCullough et al. (2005) Cognitive Brain Research
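The laterality indices plotted on slides 18 and 20 are never defined in the deck. A common convention, assumed here for illustration (the source does not give its formula), computes the index from the activated volume in each hemisphere, signed so that positive values indicate a rightward bias, matching the chart axes:

```latex
% Assumed (not stated in the source): standard volume-based laterality index,
% signed so that positive = rightward bias, as on the slide axes.
% V_R, V_L = activated volume (or voxel count) in the right/left hemisphere ROI.
% LI ranges from -1 (fully left-lateralized) to +1 (fully right-lateralized).
\mathrm{LI} = \frac{V_R - V_L}{V_R + V_L}, \qquad -1 \le \mathrm{LI} \le 1
```

Under this convention, a starred negative bar corresponds to significantly left-lateralized activation and a starred positive bar to significantly right-lateralized activation.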

  19. Linguistic Facial Expressions: Face with Verb
      [Figure: activation for linguistic facial expression vs. gender judgments, hearing and deaf groups]
      McCullough et al. (2005) Cognitive Brain Research

  20. Hemisphere Laterality Index: Fusiform Gyrus
      [Bar chart: laterality index (positive = rightward bias, negative = leftward bias) for Deaf and Hearing groups, for Emotional and Linguistic expressions, in the Face Only and Face with Verb conditions; * marks significant asymmetries]
      McCullough et al. (2005) Cognitive Brain Research

  21. Emotional Facial Expressions: Face Only
      [Figure: activation for emotional facial expression vs. gender judgments, deaf and hearing groups]
      McCullough et al. (2005) Cognitive Brain Research

  22. Summary: Facial Expression Perception
      • Greater left STS activation when linguistic facial expressions were in the obligatory verbal context
        – Left STS may integrate adverbial facial expressions with the manual verb
      • For signers, activation in STS was bilateral for emotional facial expressions
        – Emotional facial expressions play a linguistic role in narratives and in lexical emotion signs
      • For signers, neural activity within the fusiform gyrus was left-lateralized
        – Deaf signers have extensive experience analyzing local facial features (e.g., mouth configuration)

  23. Neural systems underlying spatial language

  24. Classifier constructions: The use of space to represent space
      [Figure: HOUSE located here; BIKE located here]
      “The bike is near the house.”
      Emmorey (2003)

  25. Positron Emission Tomography (PET) Studies of Spatial Language Production
      • Deaf native signers (congenitally deaf)
      • Hearing native signers (Deaf parents)
      • Hearing monolingual English speakers
      University of Iowa: Thomas Grabowski, Hanna Damasio, Richard Hichwa, Laurie Ponto
      UCSD: Stephen McCullough

  26. Stimuli
      • Locative classifier construction
      • Lexical preposition (“next to”, “in”)
      • Figure object noun (“brush”, “paint brush”)

  27. Locative classifier constructions vs. ASL prepositions
      Classifier constructions activated left inferotemporal (IT) cortex
      [Figure: left hemisphere activation, R/L marked]
      Emmorey et al. (2002) NeuroImage

  28. Locative Constructions vs. ASL Nouns
      [Figure: hearing signers and deaf signers]
      Left parietal and right parietal activation for both groups
      Emmorey et al. (2002; 2005) NeuroImage

  29. English Prepositions vs. Nouns
      [Figure: hearing signers and monolingual speakers]
      Left parietal activation for both groups; right parietal activation only for ASL-English bilinguals
      Emmorey et al. (2005) NeuroImage; H. Damasio et al. (2001) NeuroImage

  30. Summary: Spatial Language
      • Classifier constructions engage regions in left inferior temporal cortex implicated in the retrieval of names for concrete objects
        – Handshape encodes information about object type
      • The production of ASL locative classifier constructions uniquely engages regions within right parietal cortex
        – Signing space is used to represent spatial relationships
      • ASL-English bilinguals recruit right parietal cortex when producing English prepositions
        – Bimodal bilinguals may analyze spatial relationships for encoding in ASL, even when speaking English

  31. Motor-iconicity and the neural systems underlying tool and action naming
