Abstract
Hearing-impaired (HI) listeners often show less masking release (MR) than normal-hearing listeners when temporal fluctuations are imposed on a steady-state masker, even when accounting for overall audibility differences. This difference may be related to a loss of cochlear compression in HI listeners. Behavioral estimates of compression, using temporal masking curves (TMCs), were compared with MR for band-limited (500-4000 Hz) speech and pure tones in HI listeners and age-matched, noise-masked normal-hearing (NMNH) listeners. Compression and pure-tone MR estimates were made at 500, 1500, and 4000 Hz. The amount of MR was defined as the difference in performance between steady-state and 10-Hz square-wave-gated speech-shaped noise. In addition, temporal resolution was estimated from the slope of the off-frequency TMC. No significant relationship was found between estimated cochlear compression and MR for either speech or pure tones. NMNH listeners had significantly steeper off-frequency temporal masking recovery slopes than did HI listeners, and a small but significant correlation was observed between poorer temporal resolution and reduced MR for speech. The results suggest either that the effects of hearing impairment on MR are not determined primarily by changes in peripheral compression, or that the TMC does not provide a sufficiently reliable measure of cochlear compression.
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 2895-2912 |
| Number of pages | 18 |
| Journal | Journal of the Acoustical Society of America |
| Volume | 134 |
| Issue number | 4 |
| DOIs | |
| State | Published - 2013 |
Bibliographical note

Funding Information: This work was supported by NIH grants R01 DC03909 (AJO) and R01 DC008306 (PBN). The associate editor, Enrique Lopez-Poveda, provided helpful comments on an earlier version of this paper.