2. Outline
Brief etiology of deafness
Neuropsychological assessment of deaf persons
Developmental and cognitive implications of deafness
Neural and cortical plasticity associated with sensory loss
Neural and cortical plasticity associated with cochlear implants
Neurobiology of sign language
Social cognitive neuroscience insights from deafness
3. Anatomy and physiology of the ear and hearing
The pinna and external auditory canal form the outer ear, which is separated
from the middle ear by the tympanic membrane. The middle ear houses the three
ossicles (the malleus, incus, and stapes) and is connected to the back of the
nose by the Eustachian tube. Together they form the sound-conducting
mechanism. The inner ear consists of the cochlea, which transduces vibration
into a nervous impulse, and the vestibular labyrinth, which houses the organ of
balance.
4. Etiology of deafness
The type of hearing loss depends on where in the ear the problem occurs
Three basic types:
Conductive hearing loss
Occurs when sound is not conducted efficiently through the
outer ear canal to the eardrum and the ossicles of the middle
ear
Sensorineural hearing loss (SNHL)
Occurs when there is damage to the inner ear (cochlea), or to
the nerve pathways from the inner ear to the brain
Mixed hearing loss
A combination of conductive loss and SNHL: damage to the
outer or middle ear together with damage to the inner ear or auditory nerve
5. Age of onset of deafness
Prelingually deaf
95% of all deaf children are prelingually deaf
May be capable of oral communication, but usually develop
these skills much later than would be developmentally typical
Postlingually deaf
Many retain their ability to use speech and communicate with
others orally
6. Causes of hearing loss in adults
Otosclerosis
Meniere’s disease
Autoimmune inner ear disease
Very loud noise
Acoustic neuroma
Physical head injury
Presbycusis
Ototoxic medications
Aminoglycoside antibiotics, salicylates (aspirin) in large quantities, loop
diuretics, drugs used in chemotherapy regimens
7. Hearing loss in older adults
• Hearing loss is one of the most common
complaints in adults over the age of 60
• Individual differences in hearing ability
predicted the degree of language-driven
neural activity during comprehension
• Linear relationship between hearing ability
and gray matter volume in the primary
auditory cortex
• Declines in auditory ability lead to a
decrease of neural activity during
processing of higher-level speech, and may
contribute to loss of gray matter volume in
the primary auditory cortex
8. fMRI findings
• A) Regions in which poorer-hearing listeners showed less language-driven brain activity
• B) The overlap of these regions was used to define probable primary auditory cortex
• C) The strongest cortical connectivity is to the prefrontal cortex, followed by premotor and temporal cortices
9. Causes of hearing loss in children
Otitis media: inflammation in the middle ear
Congenital hearing loss
Autosomal dominant hearing loss
Autosomal recessive hearing loss
X-linked hearing loss
Prenatal infections, illnesses, and toxins
Meningitis
Acquired hearing loss
Infections, ototoxic drugs, meningitis, measles, encephalitis, chicken
pox, influenza, mumps, head injury, noise exposure
10. Assessing deaf and hard of hearing individuals
Communication mode and test administration
Use of interpreters
Selection of appropriate tests and test usage
Demographic factors influencing test
interpretation
11. Impact of deafness on neuropsychological performance
30-40% of those who are deaf or hard of hearing have
additional disabilities resulting from the same
condition, disease, or accident that caused the hearing loss
Those with mild-moderate hearing loss are sometimes
overlooked when it comes to other special needs because
it’s assumed that their hearing devices compensate for their
disability
12. Cognitive development in deaf children
Academic achievement
Reading development
Language development
Performance on standardized intelligence tests
Visual-spatial and memory skills
Conceptual development
Neuropsychological function
15. Neural and cortical plasticity associated with sensory loss
The process of developing a functional auditory system is affected significantly by
sensory deprivation
Sensory deprivation is associated with cross-modal neuroplastic changes in the
brain
Deaf individuals show superior skills in perceptual
tasks
Exact mechanisms of cross-modal plasticity and
neural basis of behavioral compensation are largely
unknown
Not all neuroplastic changes represent behavioral
gains, and restoration of a deprived sense does not
automatically restore its eventual function
16. Cochlear implantation
• Surgically implanted devices providing a sense of sound
• Cochlear implants bypass damage to sensory hair cells in the cochlea by directly
stimulating the auditory nerve and brain
• Candidates must have severe to profound sensorineural hearing loss in both ears, a
functioning auditory nerve, realistic expectations of results, and the support of family/friends
17. Neural and cortical plasticity associated with cochlear implants
The optimal time to implant a young congenitally deaf child with a unilateral
cochlear implant is within the first 3.5 years of life when the central pathways
show maximal plasticity
Neuronal mechanisms underlying sensitive periods for cochlear implantation
Delays in synaptogenesis
Deficits in higher order cortical development
Cross-modal recruitment
18. Consequences of long-term auditory deprivation
a) Deaf children who receive a
cochlear implant at 7 y.o. show
abnormal cortical auditory
evoked potentials and a lack of
top-down modulation of
incoming auditory stimuli
b) Long-term deafness beyond
the critical period results in
cross-modal cortical
reorganization
c) Auditory deprivation can
result in deficits in processing
of multimodal stimulation
necessary for language learning
19. Neurobiology of sign language
• Distinctions between spoken language (SpL) and sign language
(SL)
• Neural systems supporting signed and spoken language are very
similar – both involve a predominantly left-lateralized
perisylvian network
• The use of space in SL
• The role of the parietal cortex in SL processing
• The role of face and mouth in SL processing
20. Insights into social cognition
Developmental pathways for sociocognitive processes are influenced by
“complex interaction effects of early temperament
predispositions, socialization processes, relationship, and culture”
Visual attention
Impulsivity and distractibility
High-level visual processing
Facial expressions
Human actions
Language and communication
Theory of mind
21. References
Alberti, P.W. (2001). The anatomy and physiology of the ear and hearing. Evaluation, prevention, and
control (pp. 53-62). http://www.who.int/occupational_health/publications/noise2.pdf
Calderon, R. (1998). Learning disability, neuropsychology, and deaf youth: Theory, research, and
practice. Journal of Deaf Studies, 3, 1-3.
Corina, D. & Knapp, H. (2006). Sign language processing and the mirror neuron system. Cortex, 42,
529-539.
Corina, D. & Singleton, J. (2009). Developmental social cognitive neuroscience: Insights from
deafness. Child Development, 80, 952-967.
Hill-Briggs, F., Dial, J.G., Morere, D.A., & Joyce, A. (2007). Neuropsychological assessment of
persons with physical disability, visual impairment or blindness, and hearing impairment or
deafness. Archives of Clinical Neuropsychology, 22, 389-404.
Knapp, H.P. & Corina, D.P. (2010). A human mirror neuron system for language: Perspectives from
signed languages of the deaf. Brain & Language, 112, 36-43.
22. References (cont.)
Kral, A. & Eggermont, J.J. (2007). What's to lose and what's to learn: Development under auditory
deprivation, cochlear implants and limits of cortical plasticity. Brain Research Reviews, 56,
259-269.
Kral, A. & Sharma, A. (2011). Developmental neuroplasticity after cochlear implantation. Trends in
Neurosciences, 35, 111-122.
MacSweeney, M., Capek, C.M., Campbell, R. & Woll, B. (2008). The signing brain: The neurobiology of
sign language. Trends in Cognitive Sciences, 12, 432-440.
Mayberry, R.I. (2002). Cognitive development in deaf children: The interface of language and perception in
neuropsychology. In S.J. Segalowitz & I. Rapin (Eds.), Handbook of Neuropsychology (2nd ed.,
pp. 71-107). New York, NY: Elsevier Science.
Merabet, L.B. & Pascual-Leone, A. (2010). Neural reorganization following sensory loss: The opportunity of
change. Nature Reviews Neuroscience, 11, 44-52.
Peelle, J.E., Troiani, V., Grossman, M. & Wingfield, A. (2011). Hearing loss in older adults affects neural
systems supporting speech comprehension. The Journal of Neuroscience, 31, 12638-12643.
Sharma, A., Nash, A.A., & Dorman, M. (2009). Cortical development, plasticity and re-organization in
children with cochlear implants. Journal of Communication Disorders, 42, 272-279.
Editor's Notes
Outer ear: pinna and external auditory canal. The tympanic membrane separates it from the middle ear. The middle ear is made up of the ossicles (malleus, incus, and stapes) and is connected to the back of the nose by the Eustachian tube; together they form the sound-conducting mechanism. The inner ear includes the cochlea (transduces vibration into a nervous impulse) and the vestibular labyrinth (balance).
Conductive hearing loss is sometimes caused by earwax build-up, ear infections, or fluid accumulation in children, and treatment depends on the circumstances of the hearing loss. The root cause of sensorineural hearing loss lies in the vestibulocochlear nerve, inner ear, or other processing centers of the brain, mostly due to poorly functioning hair cells in the organ of Corti in the cochlea.
Otosclerosis: a disease involving the middle ear that results in conductive hearing loss, often surgically treatable. Meniere's disease: affects the inner ear; results in a combination of sensorineural hearing loss, vertigo, ringing in the ear, and sensitivity to loud sounds. Autoimmune inner ear disease: fast onset; can be treated with rapid medical attention. Very loud noise: can cause permanent hearing loss, usually developing gradually and painlessly. Acoustic neuroma: an example of a tumor that causes hearing loss. Physical head injury: can lead to TBI, skull fractures, a hole in the eardrum, or damage to the middle ear structures. Presbycusis: sensorineural hearing loss that develops gradually later in life; it affects both ears, speech begins to sound muffled, rhyming mistakes occur, and high-pitched sounds start to go.
Had older adults listen to sentences that varied in linguistic demands
Diagrams show regions in which language-driven activity showed a significant correlation with hearing ability
The most common causes of hearing loss in children are heredity and genetics, meningitis, otitis media, and noise. Otitis media is inflammation in the middle ear associated with the build-up of fluid, which may be infected. Genetic factors are thought to cause more than 50% of all incidents of congenital hearing loss in children.
The most significant impact on assessment occurs with severe and profound hearing loss; however, any degree of hearing loss can affect functioning and test performance.
Communication mode and test administration: Instructions should be given in the preferred communication mode of the patient. If the patient responds orally, use caution to avoid scoring articulation errors. Visual distractions tend to be more problematic and should be minimized.
Use of interpreters: Not all signing is ASL; signing skills, dialects, and approaches vary widely. The clinician needs to understand the nature of interpreting.
Selection of appropriate tests and test usage: Cognitive measures heavily loaded with verbal skills aren't appropriate. Base selection on demographic variables and an understanding of the task demands.
Demographic factors influencing test interpretation: Etiology of deafness; presence of neurological or physical comorbidities; whether parents are hearing or non-hearing; mode of communication used during childhood.
Not talking at the right age is one of the first signs that a child cannot hear. This chapter defined cognitive development as not only maturation of the brain, but the product of a child's attempts to understand the family, neighborhood, school, and the world at large during this critical period. The effects of deafness on this are diverse and complex and depend on the social context of the child.
Academic achievement: Deafness itself does not impede a child's ability to learn. When age-matched, 15-year-old deaf children in the US are at the 10th grade level in math; however, 17-21-year-old deaf students leaving high school are at the 4th grade level in reading. The primary effect of degree of hearing loss on language development interacts with factors extraneous to deafness (i.e., SES, ethnicity, and additional handicaps). Deaf children with additional motor/sensory impairments (i.e., poor vision, cerebral palsy) perform less well. The conclusion is that the academic achievement of deaf students is predicted by the same factors that predict academic achievement of normally hearing students in the US: social class, ethnic/racial background, and other handicapping conditions.
Reading development: Many deaf students do not reach a reading level considered literate; deafness clearly creates a barrier to reading development. Spoken language: students who achieved in reading shared above-average performance on nonverbal IQ tests, parents who were college educated and professionally employed, and special education that had begun at or before preschool. Sign language: there is debate regarding whether it's detrimental to reading development (different grammatical structure); historically the answer was yes, but recent studies have shown positive correlates, indicating that it's not that deaf children can't speak English, but that the achievement gap results from impoverished language development overall.
Language development: Very dependent on early childhood exposure to language (whether the child is pre- or postlingually deaf). What are the consequences of the gaps in language development for cognitive development? Does the sensory modality of the child's primary language have any effects on cognitive development?
IQ test performance: Is lower performance related to lower IQ? Deaf children show normal performance on non-verbal IQ tests.
A score of 100 represents average performance.
Visual-spatial and memory skills: Supernormal visual skills. Positive effects of sign language on facial recognition, recognizing movement patterns, and generating/rotating mental images (general visuospatial abilities); gender differences and age of sign language acquisition have not been explored.
Short-term memory: Auditory experience does influence STM span, but indirectly. The digit span of hearing subjects typically exceeds the digit span of deaf subjects, even when signed or written. Development of STM is affected by learning sign language/finger-spelling; early learners have more ability to mentally rehearse than those who were delayed in language acquisition.
Conceptual development: Past research focused on Piaget's stages, but more recent work looks at domains of knowledge (understanding of people and the physical and biological worlds). Theory of mind: deaf children's understanding of other people's behavior; the child needs to learn that other people have desires and beliefs that are different from their own and that those desires/beliefs can explain others' behavior. Language development ends up being the key determinant of success on these tasks, not deafness itself.
Neuropsychological function: In terms of brain organization, there seem to be separate neurocortical effects for sensory compensation and acquiring spatial grammar. Childhood deafness produces visual sensory compensation in regions of the cortex normally responsible for visually processing motion. Studies have conflicted on hemispheric activation in producing and processing sign language (more specifics on that later).
Conclusion: Deaf children often experience issues with language development (SL or SpL); this is not caused by and does not cause intellectual deficiencies that function independently of language. Language difficulty does lead to poor reading and poor academic achievement. Incomplete language development also affects theory of mind development. The young brain is very plastic, however, which we'll talk about next.
That is, brain areas that are normally associated with the lost sense are recruited by spared sensory modalities (underlying adaptive and compensatory behaviors in deaf individuals): enhanced tactile sensitivity, better at distinguishing emotional expression and facial features, better at allocating attention during peripheral visual tasks.
The cochlea is technically the sense organ; the rest of the ear is concerned with conducting sound to the cochlea. The cochlea is concerned with transducing vibration, converting the vibrations to nerve impulses which are taken up to the brain to be interpreted.
Parts: One or more microphones, which pick up sound from the environment. A speech processor, which selectively filters sound to prioritize audible speech, splits the sound into channels, and sends the sound through a cable to the transmitter. The transmitter is a coil held in place by a magnet behind the external ear; it transmits sound signals across the skin to an internal device. The receiver and stimulator sit beneath the skin; they convert signals into electrical impulses and send them through an internal cable to the electrodes. An array of up to 22 electrodes wound through the cochlea sends the impulses to the nerves in the scala tympani and then directly to the brain through the auditory nerve system.
When implanted in children, many factors influence language development, but most importantly, the cochlear implant needs to stimulate the auditory cortex. If implants are fitted early enough and in language-rich environments, children demonstrate remarkable success in spoken language; later implantation may result in the ability to hear but the inability to detect complex sounds such as spoken language.
Neurobiology tells us that certain areas of the cortex will re-organize if appropriate stimulation is withheld for long periods; therefore, stimulation must happen to a sensory system within a narrow window of time (known as critical/sensitive periods) if the system is to develop normally. After the sensitive period ends at age 7, there is a high likelihood of de-coupling of the primary auditory cortical areas from surrounding higher-order cortex and cross-modal re-organization of secondary cortical areas.
Delayed synaptogenesis: In humans with normal hearing, synaptogenesis peaks at 2-4 years of age in the temporal cortex and declines afterward. Deafness results in delayed synaptogenesis in the auditory cortex, suggesting that experience-dependent input must be provided to a child within the sensitive period to drive synaptogenesis.
Deficits in cortico-cortical interactions: There is still residual plasticity that allows late-implanted, prelingually deaf people to develop speech performance, but they still tend to show poor speech recognition and auditory performance, even after long durations of implantation. This indicates that other factors besides plasticity close down the sensitive period, particularly integrative functions of the auditory system (i.e., reduced sensitivity leading to insufficient representation of features of sound in the brain). The patterns of neuronal activity within the auditory cortex and the interactions between different cortical areas are essential for normal hearing; though it's not clear how auditory objects are represented in the brain, we know that representations are generated in the cortex and cortico-cortical interaction is required for encoding.
Cross-modal recruitment: Global high-level processes are substantially affected by congenital deafness (e.g., deficits in non-auditory functions have been observed, such as fine motor coordination, working memory, and sustained attention in the visual system). Compensatory (or cross-modal) plasticity in deafness probably occurs as a means to improve interaction with the environment under sensory deprivation (e.g., improved visual performance during visual location, visual attention, and motion detection). Children who do well with the cochlear implant activate dorsolateral prefrontal networks (higher cognitive functions, reasoning, attentional control, working memory); children who don't do well with the cochlear implant show functional specialization of auditory cortical areas for visual processing.
Image: sign language activates the "language" areas but not the primary auditory cortex.
This explains the difficulties of oral language learning experienced by late-implanted children. (i) fMRI shows activation in response to visual stimulation in the temporal cortex (including higher-order auditory cortical areas) of deaf adults; (ii) MEG (magnetoencephalography) images reflect activation of both somatosensory cortex (blue) and higher-order auditory cortex (green, including Wernicke's area) in response to tactile stimulation in deaf adults. Responses of children with normal hearing suggest auditory dominance; by contrast, responses of children with cochlear implants suggest a greater visual dominance when assessed with audiovisual speech (lip-reading "ka" while hearing "pa" can produce "ta").
SpL uses vocal articulators, while SL uses actions of the hands, upper torso, and face. SL uses visual imagery, space, and movement in ways not available to SpL. Similarities reflect neural underpinnings of core language functions; differences are driven by the modality of communication itself. Both SpL and SL production occur in the left inferior frontal gyrus. The left hemisphere plays a crucial role in SL processing. Neuroimaging confirms that there's an overlap in cortical networks during gesture and SL perception.
The use of space in SL: Spatial processing is usually considered to be right-hemisphere dominant, but SL is processed bilaterally, as shown in lesion studies, raising questions about the specificity of cognitive and linguistic processing domains.
The role of the parietal cortex in SL processing: Greater activation in the inferior/superior parietal lobes during SL compared to SpL production, and during short-term memory tasks for SL compared to SpL.
The role of the face and mouth in SL processing: When a negated utterance contained both manual and nonmanual forms, both left- and right-hemisphere-damaged patients understood it. However, when only facial negation was present, the right-hemisphere-damaged patients were impaired, indicating that facial expressions can be processed bilaterally or predominantly in the left hemisphere, reflecting reorganization due to the importance of facial expressions in SL.
Studies of deafness provide examples of how interactions and contributions of biological predispositions and genetic phenotypes with environmental and cultural factors (i.e., childhood experiences and actions of caregivers) shape developmental trajectories. These three categories of research pertaining to deafness are where people who are deaf differ significantly from their hearing counterparts; they're shaped by sensory deprivation, as we just discussed, but they're also:
Visual attention: A common claim is that behavior problems of deaf children are related to impulsivity and distractibility, possibly a deficit in visual selective attention stemming from poor multimodal sensory integration as a result of profound sensory deprivation (deficiency hypothesis); alternatively, it's adaptive (looking for alerts), or it's related to executive functioning issues.
High-level visual processing: Facial expressions are used extensively in ASL to convey meaning. The demands of language processing in the visual-gestural modality may be driving the differential specialization and reorganization within temporal lobe regions. Acquisition of a visual-gestural language appears to place requirements for the development of specialized mechanisms for processing human emotions/actions.
Language and communication: Theory of mind is the awareness of how mental states such as memories, beliefs, desires, and intentions govern the behaviors of self and others. 95% of deaf children are raised in hearing households, which means they aren't exposed to a consistent language model; this can lead to delays in language development at critical periods. As such, deaf children raised in hearing households do not do well on verbal or nonverbal theory of mind tasks.