While we tend to characterise silence as the absence of sound, the brain hears it loud and clear, US researchers have found.
According to a recent study from the University of Oregon, some areas of the brain respond solely to sound termination.
Rather than sound stimuli travelling through the same brain pathways from start to finish as previously thought, studies of neurone activity in rats have shown that the onset and offset of sounds take separate routes.
"This is something we see a lot of in the brain: that features which are important for perception are computed and then explicitly represented," says Michael Wehr, lead researcher and psychologist at the University of Oregon's Institute of Neuroscience.
Knowing how the brain responds to, and organises, sounds could lead to better treatment for those who have hearing loss.
Sound enters the ear as vibrations, which the cochlea converts into neural signals that travel to the auditory cortex, the part of the brain responsible for processing sound.
By recording neural activity before and after exposure to brief noises, Wehr and his team discovered that neurones sort the start and end of sounds through separate channels.
"In the auditory system, information about the onset and offset of a sound is implicitly contained in the firing of neurones close to the sensory receptors, but is explicitly represented by on-responses and off-responses in higher brain areas," says Wehr.
These discrete responses are especially important for language processing.
"Examples are the distinction between 'chop' and 'shop,' or between 'stay' and 'say,'" says Dr Marjorie Leek, a research investigator for the National Center for Rehabilitative Auditory Research.
"In both of these examples, there's a short, transient-like difference either on the beginning of one of the words or within the syllable. Onset and offset responses would be critical to perceiving these cues related to silence."
Although different neurones may respond to sound onsets and offsets, the brain relies on all of them equally to correctly decipher the timing, source and motion of sounds.
"One of the major challenges of the entire ear-brain system is to preserve precise timing information that is ubiquitous in human speech, that supports information about localisation of sound in space, that allows a listener to separate sound sources that are occurring simultaneously, that help to suppress echoes in a highly-reverberant space, and that provide cues to auditory motion," says Leek.
For people with hearing problems, the auditory system fails to properly encode the frequencies and temporal cues needed to understand and recognise sounds.
A better understanding of how the brain organises and groups sounds could lead to more effective hearing therapies and devices, although Wehr acknowledges that much follow-up research remains to be done.