An information theoretic characterisation of auditory encoding (22/02/07)
The entropy metric derived from information theory provides a means to quantify the amount of information transmitted in acoustic streams such as speech or music. The encoding of such streams in the brain has been proposed to depend critically on a 'computational hub' in human auditory cortex, the planum temporale (PT). By systematically varying the entropy of 'fractal' pitch sequences, we tested this model's prediction that the computational load of encoding, and therefore the local synaptic activity and energetic demand within PT, increases as a function of the entropy of the acoustic signal. In two convergent fMRI studies, we demonstrate this relationship in PT for encoding, and further show that a distributed fronto-parietal network for retrieval of acoustic information is independent of entropy. The results establish PT as an efficient neural engine that demands fewer computational resources to encode redundant signals than signals with high information content.
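To illustrate the entropy measure referred to above, a minimal sketch in Python: first-order Shannon entropy of a discrete pitch sequence, in bits per symbol. This is only an illustration of the general metric, not the authors' stimulus-generation or entropy-estimation procedure; the MIDI-style pitch values are invented for the example.

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """First-order Shannon entropy (bits/symbol): H = -sum(p_i * log2(p_i))."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A redundant (low-entropy) sequence vs. a more unpredictable one
# (hypothetical MIDI pitch numbers, purely for illustration).
redundant = [60, 62, 60, 62, 60, 62, 60, 62]  # two pitches, strictly alternating
varied    = [60, 62, 64, 65, 67, 69, 71, 72]  # eight distinct pitches

print(shannon_entropy(redundant))  # 1.0 bit/symbol
print(shannon_entropy(varied))     # 3.0 bits/symbol
```

A sequence drawn from fewer, more repetitive symbols is more redundant and yields lower entropy, matching the abstract's contrast between redundant signals and those with high information content.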