
Christo Pantev, Bernhard Ross, Takako Fujioka, Laurel J Trainor, Michael Schulte, and Matthias Schulz (2003)

Music and learning-induced cortical plasticity

Ann N Y Acad Sci, 999:438-50.

Auditory stimuli are encoded by frequency-tuned neurons in the auditory cortex. There are a number of tonotopic maps, indicating that there are multiple representations, as in a mosaic. However, the cortical organization is not fixed, due to the brain's capacity to adapt to current requirements of the environment. Several experiments on cerebral cortical organization in musicians demonstrate an astonishing plasticity. We used the MEG technique in a number of studies to investigate the changes that occur in the human auditory cortex when a skill is acquired, such as when learning to play a musical instrument. We found enlarged cortical representation of tones of the musical scale as compared to pure tones in skilled musicians. Enlargement was correlated with the age at which musicians began to practice. We also investigated cortical representations for notes of different timbre (violin and trumpet) and found that they are enhanced in violinists and trumpeters, preferentially for the timbre of the instrument on which the musician was trained. In recent studies we extended these findings in three ways. First, we show that we can use MEG to measure the effects of relatively short-term laboratory training involving learning to perceive virtual instead of spectral pitch and that the switch to perceiving virtual pitch is manifested in the gamma band frequency. Second, we show that there is cross-modal plasticity in that when the lips of trumpet players are stimulated (trumpet players assess their auditory performance by monitoring the position and pressure of their lips touching the mouthpiece of their instrument) at the same time as a trumpet tone, activation in the somatosensory cortex is increased more than it is during the sum of the separate lip and trumpet tone stimulation. Third, we show that musicians' automatic encoding and discrimination of pitch contour and interval information in melodies are specifically enhanced compared to those in nonmusicians, in that musicians show larger functional mismatch negativity (MMNm) responses to occasional changes in melodic contour or interval, but that the two groups show similar MMNm responses to changes in the frequency of a pure tone.
