The sense of hearing is unique among the senses and motor pathways of the body in that it does not cross over to project primarily to brain areas on the opposite side. This does not mean the pathway doesn't "decussate" – that is, cross over and send projections to the opposite side of the brain. In fact, these crossover projections are essential to localizing and tracking sound sources.
The diagram at right shows the stages in the auditory processing pathway from cochlea to auditory cortex. In computer/electronic terms, the process described in yesterday's blog essentially acts as an analog-to-digital converter. The analog sound waveform is turned into a parallel digital signal by activating hair cells at particular points along the basilar membrane, thus separating the waveform into the essentially "digital" detection of discrete frequencies. Neurons attached to the hair cells project to the cochlear nucleus, and from there to the superior olive and inferior colliculus of the brainstem. It is at this level that most of the crossover of signals occurs – but not to project the sensation to the opposite side of the brain! No, the decussation of neuron projections in the brainstem is important for comparing signals between the two ears. Differences in sound intensity as well as differences in *phase* are used to determine whether a sound originated on the left or the right.
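The analog-to-digital analogy can be made concrete with a short sketch. This is purely illustrative (an FFT, not a model of the cochlea): it splits a sampled waveform into discrete frequency "channels," much as the basilar membrane maps each frequency to a position along its length. The sample rate and tone frequencies here are arbitrary choices for the demonstration.

```python
import numpy as np

# Illustrative sketch (not a cochlear model): decompose a sampled
# waveform into discrete frequency components, the way the basilar
# membrane maps frequencies to positions along its length.
fs = 8000                          # sample rate, Hz (arbitrary choice)
t = np.arange(0, 1, 1 / fs)        # one second of samples
# A chord of three pure tones standing in for a complex sound.
wave = (np.sin(2 * np.pi * 440 * t)
        + 0.5 * np.sin(2 * np.pi * 880 * t)
        + 0.25 * np.sin(2 * np.pi * 1320 * t))

spectrum = np.abs(np.fft.rfft(wave))
freqs = np.fft.rfftfreq(len(wave), 1 / fs)

# The three strongest "channels" correspond to the three tones,
# much as three distinct groups of hair cells would be activated.
peaks = sorted(freqs[np.argsort(spectrum)[-3:]])
print(peaks)                       # → [440.0, 880.0, 1320.0]
```

The continuous pressure wave goes in; what comes out is a set of discrete, labeled frequency detections – exactly the kind of "digital" representation the hair cells hand off to the cochlear nucleus.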
Most people are familiar with the Doppler Effect for sound. A sound source coming toward you seems higher pitched, because the sound waves are compressed by the motion of the source, creating a slight increase in pitch. As the source moves away from you, the sound waves get longer, and the pitch drops. The outer ear, or "pinna," is shaped to funnel sound – but it also introduces minor differences in *phase*. Phase is related to the Doppler Effect, but it is more an effect of the *distance* that sound travels to reach the ear. The best example of a phase difference is to have a person in the same room with you call you on the phone. The voice you hear through the phone is delayed compared to the voice you hear directly. The delay is due to the electrical relays and the *distance* the signal must travel. A sound source to your right reaches the right ear sooner than it does the left ear. The pitch – and therefore the waveform – is exactly the same, but the *phase* is different due to the additional travel time around the head to the opposite ear. Likewise, the sound is slightly louder in the right ear.
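Just how big is that extra travel time around the head? A rough sketch, using the classic Woodworth spherical-head approximation and assuming a typical head radius of about 8.75 cm (both assumptions, not figures from this post):

```python
import math

def interaural_time_difference(azimuth_deg, head_radius_m=0.0875):
    """Rough interaural time difference (seconds) for a source at the
    given azimuth (0 deg = straight ahead, 90 deg = directly to one side).

    Uses the Woodworth spherical-head approximation:
        delta_t = (r / c) * (theta + sin(theta))
    where r is the head radius and c is the speed of sound.
    """
    c = 343.0                        # speed of sound in air, m/s (~20 °C)
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + math.sin(theta))

# A source 90° to the right reaches the far ear roughly 0.66 ms late.
print(f"{interaural_time_difference(90) * 1e6:.0f} microseconds")  # → 656 microseconds
```

A delay of well under a millisecond – and the superior olive resolves differences far smaller than that, which is why these brainstem comparisons can pin down left versus right so precisely.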
The process of comparing the sounds received at each ear – what we call "binaural comparison" – is necessary to determine the direction from which a sound originates. However, we live in a 3-D world; left or right is not enough. Again, the shape of the pinna, with the auditory canal at the bottom and the folds at the back, further deflects the trajectory of sound waves, introducing differences that can be used to detect up-down and front-back. Of course, to further localize the source of a sound, we can always move the head. In that case, we need to include the additional information of which direction the head is pointing, requiring the thalamic relays and somatosensory feedback in the parietal lobe. To further aid our search, we can add visual information from V4 and the frontal eye fields and point our eyes at the suspected source of the sound for positive identification. Full auditory localization requires a number of different pathways and integrates information from all over the brain, but the initial processing is done in the brainstem and doesn't even require conscious attention!
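The left-right half of binaural comparison can be caricatured in a few lines. This toy function is in no way a model of the superior olive – it simply votes on the two cues the brainstem actually compares: which ear the sound reached first, and which ear heard it louder. All names and numbers here are invented for the illustration.

```python
def locate_left_right(t_left, t_right, level_left_db, level_right_db):
    """Toy binaural comparison: vote on left vs. right using the
    arrival-time difference and the interaural level difference."""
    votes = 0
    if t_right < t_left:                 # reached the right ear first
        votes += 1
    elif t_left < t_right:               # reached the left ear first
        votes -= 1
    if level_right_db > level_left_db:   # louder in the right ear
        votes += 1
    elif level_left_db > level_right_db: # louder in the left ear
        votes -= 1
    if votes > 0:
        return "right"
    if votes < 0:
        return "left"
    return "center"

# A source on the right: right ear hears it first *and* louder.
print(locate_left_right(t_left=0.00066, t_right=0.0,
                        level_left_db=58, level_right_db=61))  # → right
```

When the two cues agree, the direction is unambiguous; when they cancel out (identical timing and loudness), the source sits on the midline – which is exactly why front-back and up-down need the extra cues from the pinna described above.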
The final relay from thalamus to cortex presents information organized by pitch (frequency), intensity, location, and "tone," forming a map not only of sound quality but also of that sound's location in space, in a manner similar to – though not as detailed as – the visual map in area V1. As in the visual system, a great deal of preprocessing occurs in the brainstem and subcortical areas; and again, as in the visual system, damage to the primary cortical areas may leave the subject unaware that a stimulus was received, yet still able to react (with a "startle" response) to loud or sudden noises.
It's a fascinating process, and this description *still* does not address the more complex modes of auditory integration. The primary auditory cortex receives information about pitch and loudness, but the elements that comprise "music" – harmony, melody, rhythm – are properties of the secondary (association) auditory cortex. Further integration of sound into "music," "speech," and natural sounds occurs in the tertiary auditory cortex and association areas located in... the parietal lobe adjacent to the visual association areas. The involvement of these areas in speech and language, as well as their integration with vision for tracking, reading, etc., will be explored in the next edition of this blog.