What you hear is what you see: Audio quality from Image Quality Metrics

Tashi Namgyal; Alexander Hepburn; Raul Santos-Rodriguez; Valero Laparra; Jesus Malo
DAFx-2023 - Copenhagen
In this study, we investigate the feasibility of using state-of-the-art perceptual image metrics to evaluate audio signals by representing them as spectrograms. The encouraging outcome of the proposed approach rests on the similarity between the neural mechanisms in the auditory and visual pathways. Furthermore, we customise one of the metrics, which has a psychoacoustically plausible architecture, to account for the peculiarities of sound signals. We evaluate the effectiveness of our proposed metric and several baseline metrics on a music dataset, with promising results in terms of the correlation between the metrics and the perceived audio quality as rated by human evaluators.
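As a minimal sketch of the core idea (not the authors' implementation), one could compute a log-magnitude spectrogram of each signal and score the pair with an off-the-shelf image quality metric such as SSIM. The sketch below assumes SciPy and scikit-image; the sample rate, FFT size, and test signals are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.signal import spectrogram
from skimage.metrics import structural_similarity


def spectrogram_db(audio, sr=16000):
    """Return a log-magnitude spectrogram treated as a 2-D grayscale image."""
    _, _, sxx = spectrogram(audio, fs=sr, nperseg=512)
    return 10.0 * np.log10(sxx + 1e-10)  # small floor avoids log(0)


# Illustrative test pair: a clean tone and a noise-degraded copy.
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
clean = np.sin(2 * np.pi * 440 * t)
noisy = clean + 0.1 * np.random.default_rng(0).standard_normal(sr)

s_clean = spectrogram_db(clean, sr)
s_noisy = spectrogram_db(noisy, sr)

# SSIM compares the two spectrograms exactly as it would two images.
rng = max(s_clean.max(), s_noisy.max()) - min(s_clean.min(), s_noisy.min())
score = structural_similarity(s_clean, s_noisy, data_range=rng)
print(f"SSIM(clean, noisy) = {score:.3f}")
```

Any full-reference image metric with a comparable interface could be substituted for SSIM in the last step; the paper's customised metric additionally adapts the architecture to psychoacoustic properties of sound.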