1 of 28

Sign Language in the Brain

LING 104 Lecture

2 of 28

Today

  • Short lecture on sign language & the brain
  • Video on sign language in the brain
  • With a sneak peek of exam questions!

  • Vote for your favorite article summary here!
    • Feedback on summaries will be returned in just a few days.
  • Regrade requests will be reviewed by tomorrow.

3 of 28

Common misconceptions (sign language)

  1. Simple gestures that allow rudimentary communication
    • Actually, they are true natural languages with elaborate structure and rules, just as complex as spoken languages
  2. One universal sign language
    • Actually, just like spoken language, different areas of the world use different sign languages
    • French Sign Language, Japanese Sign Language, American Sign Language, British Sign Language
  3. A manual version of spoken language
    • Again, not true! Each sign language is a completely different language from the dominant spoken language of its region
    • To illustrate: American Sign Language and British Sign Language are mutually unintelligible, even though English is the dominant spoken language in both regions

4 of 28

The structure of signed languages

  • Languages (spoken or signed) are combinatorial systems with hierarchical structure
  • Signed languages have phonological structure
    • Phonemes are basic linguistic units of language; the smallest unit that distinguishes meaning
    • Classified in sign by location, handshape, movement, and orientation

5 of 28

Phonology example

  • Signs for “summer”, “ugly”, and “dry”
  • Have the same:
    • Handshape
    • Movement
    • Orientation
  • Differ in:
    • Location

Location is the smallest unit that distinguishes meaning here, i.e. a phoneme (see the sketch below)
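
To make the minimal-pair idea concrete, here is a small illustrative sketch (in Python) that treats each sign as a bundle of the four phonological parameters and reports where two signs differ. The parameter values are rough, hypothetical labels, not precise ASL transcriptions.

```python
# Illustrative sketch: signs as bundles of phonological parameters.
# Parameter values are hypothetical labels, not precise ASL transcriptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Sign:
    gloss: str
    handshape: str
    location: str
    movement: str
    orientation: str

SIGNS = [
    Sign("SUMMER", "bent-1", "forehead", "across", "palm-down"),
    Sign("UGLY",   "bent-1", "nose",     "across", "palm-down"),
    Sign("DRY",    "bent-1", "chin",     "across", "palm-down"),
]

def differing_parameters(a: Sign, b: Sign) -> list[str]:
    """Return the phonological parameters on which two signs differ."""
    params = ("handshape", "location", "movement", "orientation")
    return [p for p in params if getattr(a, p) != getattr(b, p)]

# SUMMER vs. UGLY differ only in location -> a minimal pair,
# so location alone distinguishes meaning here (it is phonemic).
print(differing_parameters(SIGNS[0], SIGNS[1]))   # ['location']
```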

6 of 28

The structure of signed languages

  • Languages (spoken or signed) are combinatorial systems with hierarchical structure
  • Signed languages have phonological structure
    • Phonemes are basic linguistic units of language; the smallest unit that distinguishes meaning
    • Classified in sign by location, handshape, movement, and orientation
  • Signed languages have syntactic structure (grammar)
    • In English, word order (the order in which we are allowed to put our expressions) specifies grammatical relationships (who is doing what to whom)
    • In ASL, word order can be used, but it is not required. Instead, one can point to a position in space while signing a noun, indexing that referent to a spatial locus (see the sketch below)
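
Here is a minimal, hypothetical sketch of that idea: referents are indexed to positions in signing space, and the path of an agreeing verb from one locus to another, not sign order, indicates who is doing what to whom. The glosses, locus names, and helper function are illustrative inventions, not a real ASL grammar.

```python
# Hypothetical sketch: "who is doing what to whom" recovered from signing space
# rather than from word order. Glosses and locus names are invented for illustration.

# Step 1: the signer points to positions in space while signing each noun,
# indexing that referent to a locus.
loci = {
    "locus_left": "MOTHER",
    "locus_right": "BOY",
}

# Step 2: an agreeing verb (e.g. GIVE) moves from the subject's locus to the
# object's locus; the path, not the order of signs, encodes the grammatical roles.
def interpret_agreeing_verb(verb: str, start_locus: str, end_locus: str) -> str:
    subject = loci[start_locus]
    obj = loci[end_locus]
    return f"{subject} {verb.lower()}s (something to) {obj}"

print(interpret_agreeing_verb("GIVE", "locus_left", "locus_right"))
# MOTHER gives (something to) BOY
print(interpret_agreeing_verb("GIVE", "locus_right", "locus_left"))
# BOY gives (something to) MOTHER
```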

7 of 28

Sign language in the brain: Where is it?

Hypothesis 1: Same as spoken language

  • Broca’s and Wernicke’s areas
  • Left lateralized

Hypothesis 2: Different from spoken language

  • Alternate “Broca’s” and “Wernicke’s” areas
  • Right lateralized

8 of 28

Lesion Evidence

9 of 28

Broca’s aphasia

  • Hearing patients with damage to Broca’s area show deficits with the grammar of spoken language.
  • Deaf patients show similar impairments in sign language.
  • Case study: Patient 1
    • Deaf native signer of ASL
    • Difficulty producing signs; struggled to shape, orient, and move her hands
    • Most utterances limited to isolated signs
    • Not just a motor issue: unimpaired at copying line drawings (e.g., elephant, flower)
    • Comprehension of others signing quite good

10 of 28

Wernicke’s aphasia

  • Hearing patients with damage to Wernicke’s area show deficits with lexical-semantic processing
  • Deaf patients show similar impairments in sign language.
  • Case study: Patient 2
    • Deaf, fluent signer of ASL who used all the proper grammatical markers
    • But the meaning conveyed was often incoherent, and the patient could not understand the signing of others

11 of 28

Right-hemisphere damage

  • Right-hemisphere (RH) is thought to be crucial for visual-spatial processing, so we might guess it would be devastating for sign language processing
  • But signers with RH damage are not impaired in either production or comprehension
    • Signing is fluent and accurate, normal grammar, can produce and understand signs with ease
  • Even when nonlinguistic visual-spatial abilities severely impaired
  • Case study: Patient 3
    • Deaf native ASL signer with RH damage
    • Nonlinguistic visual-spatial ability tested via drawings and found to be impaired (demonstrated left hemineglect typical of RH damage)

12 of 28

Test of visual-spatial ability showing hemineglect

13 of 28

Right-hemisphere damage

  • Right-hemisphere (RH) is thought to be crucial for visual-spatial processing, so we might guess it would be devastating for sign language processing
  • But signers with RH damage are not impaired in either production or comprehension
    • Signing is fluent and accurate, normal grammar, can produce and understand signs with ease
  • Even when nonlinguistic visual-spatial abilities severely impaired
  • Case study: Patient 3
    • Deaf native ASL signer with RH damage
    • Nonlinguistic visual-spatial ability tested via drawings and found to be impaired (demonstrated left hemineglect typical of RH damage)
    • But completely unimpaired on sign language

14 of 28

ERP Evidence

15 of 28

Sign language processing in ASL with ERP

  • Will we see our familiar ERP components in ASL processing?
    • N400 (semantic) and ELAN-P600 (syntactic); the sketch below illustrates the logic of the comparison
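
To show the logic of the comparison, here is a sketch using entirely synthetic data (made-up numbers, not results from these studies): average the single-trial epochs within each condition and compare the difference wave's mean amplitude in the classic 300-500 ms N400 window.

```python
# Sketch of the ERP logic with synthetic data (not real EEG):
# average epochs per condition, then compare mean amplitude in the N400 window.
import numpy as np

rng = np.random.default_rng(0)
fs = 250                                 # sampling rate (Hz)
times = np.arange(-0.2, 0.8, 1 / fs)     # epoch from -200 ms to +800 ms

def make_epochs(n_trials: int, n400_amplitude: float) -> np.ndarray:
    """Synthetic single-trial EEG: noise plus a negative deflection near 400 ms."""
    noise = rng.normal(0.0, 5.0, size=(n_trials, times.size))
    n400 = -n400_amplitude * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))
    return noise + n400

normal = make_epochs(40, n400_amplitude=1.0)      # semantically normal sentences
anomalous = make_epochs(40, n400_amplitude=5.0)   # semantically anomalous sentences

# ERP = average over trials; the difference wave isolates the violation effect.
erp_normal = normal.mean(axis=0)
erp_anomalous = anomalous.mean(axis=0)
difference = erp_anomalous - erp_normal

window = (times >= 0.3) & (times <= 0.5)          # classic N400 window
print(f"Mean difference 300-500 ms: {difference[window].mean():.2f} µV")
# A reliably more negative amplitude for anomalous sentences is read as an N400 effect.
```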

16 of 28

Sign language processing in ASL with ERP

  • Neville et al., 1997
    • Four groups differing in ASL exposure: deaf signers with deaf parents, hearing signers born to deaf parents, hearing late learners, and hearing non-signers
    • Presented semantically normal and semantically anomalous sentences
    • Results: an N400 to semantically anomalous sentences in all groups except the hearing non-signers

17 of 28

Sign language processing in ASL with ERP

  • Capek et al., 2009
    • Deaf native ASL participants
    • Replicated the N400 effect for semantic violations in ASL
    • Also found the ELAN-P600 for syntactic violations (verb-agreement violations)

18 of 28

Imaging Evidence

19 of 28

Imaging studies report LIFG and STG activity

  • Petitto et al., 2000
    • Two sign language groups, deaf ASL signers (American Sign Language) and deaf LSQ signers (Langue des signes québécoise, Quebec Sign Language), plus hearing controls
    • Generating and viewing meaningful signs
    • Activation found in the left inferior frontal gyrus (LIFG) and the temporal lobe (superior temporal gyrus, STG)
  • Many other studies report similar findings

20 of 28

Imaging studies report LIFG and STG activity

  • Meyer et al., 2007
    • Native signers and hearing non-signers naive to signing
    • Only native signers activated typical language areas when viewing sign language; hearing non-signers activated visual cortex

21 of 28

Behavioral Evidence

22 of 28

Milestones of acquisition

  • Spoken language acquisition follows a standard trajectory of milestones
    • 0-1 month: Cooing
    • 6 months: Babbling
    • 12 months: First word
    • 12-18 months: One-word speech
    • 18-24 months: Two-word speech
    • 2-4 years: More complex structures

  • Does language acquisition follow this fixed trajectory, regardless of modality?

23 of 28

Petitto & Marentette

24 of 28

Test case: babbling

In spoken language

(credit: Laura McGarrity)

In signed language

(credit: Marie Coppola)

25 of 28

Continuity between first words and babbling forms

26 of 28

27 of 28

Extra slides

28 of 28

Test case: babbling

  • Babbling occurs in both spoken and signed languages
  • Petitto and Marentette, 1991
    • Compared deaf infants (deaf parents) to hearing infants (hearing parents)
    • 10 to 14 months old
    • Infants in both groups produced similar amounts of hand movement
    • But only the deaf infants produced manual babbling movements (built from ASL phonemes); see the sketch after this list
  • Petitto et al., 2001
    • Two groups of hearing babies: born to deaf parents and born to hearing parents
    • 6 to 12 months old
    • Same finding as above: only hearing children born to deaf parents showed manual babbling movements
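
A small sketch of the comparison's logic with made-up counts (not the studies' data), assuming each infant's hand activity has already been coded as either manual babbling (reduplicated movements built from sign-language phonological units) or other gesture:

```python
# Sketch of the group comparison with made-up counts (not the studies' data):
# both groups move their hands a lot, but only the sign-exposed infants'
# movements include a substantial share of manual babbling.
coded_hand_activity = {
    "deaf_infants_of_deaf_parents": {"manual_babbling": 48, "other_gesture": 52},
    "hearing_infants_of_hearing_parents": {"manual_babbling": 6, "other_gesture": 94},
}

for group, counts in coded_hand_activity.items():
    total = sum(counts.values())                   # overall hand activity is similar
    share = counts["manual_babbling"] / total      # but its composition differs
    print(f"{group}: {total} movements, {share:.0%} manual babbling")
```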