Virtual acoustic space (VAS) is a multidisciplinary R&D project on sensory substitution designed to study the human auditory capacity to perceive complex spatial patterns through a sound code. The aim is to improve blind people’s mobility and orientation. A device has been developed that transduces, online, visual scene information captured by a pair of microcameras into specially processed sound delivered through headphones. The subject perceives this sound as multiple, irregularly repeating clicks coming from the coordinates occupied by the surfaces of objects in the scene, generating a unitary percept of each object’s whole surface extending over the relevant coordinates in the field of view (González Mora et al. 1999).
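The text above does not specify how surface coordinates are converted into spatialized clicks. As a rough, hypothetical illustration of the general principle, a spherical-head model can map a point's azimuth and distance to simple binaural cues (interaural time difference plus a level difference); all function names, constants, and formulas below are our own assumptions for a sketch, not the device's actual signal processing.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
HEAD_RADIUS = 0.0875     # m, a commonly assumed average head radius

def spatial_cues(azimuth_deg, distance_m):
    """Toy binaural cues for a click from a surface point.

    Returns (itd_s, left_gain, right_gain): an interaural time
    difference from Woodworth's spherical-head approximation and a
    crude inverse-distance amplitude with a head-shadow level
    difference. Illustrative only; not the VAS device's processing.
    """
    az = math.radians(azimuth_deg)
    # Woodworth ITD: (r/c) * (sin|theta| + |theta|); sign follows azimuth,
    # so a positive ITD means the source is to the subject's right.
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(abs(az)) + abs(az))
    itd = math.copysign(itd, az)
    # Inverse-distance amplitude plus a simple lateral level difference.
    base = 1.0 / max(distance_m, 0.1)
    shadow = 0.5 * math.sin(az)          # in [-0.5, 0.5]
    left_gain = base * (1.0 - shadow)
    right_gain = base * (1.0 + shadow)
    return itd, left_gain, right_gain
```

For a frontal point (azimuth 0) the cues are symmetric; for a point at 90° to the right the model yields an ITD of roughly 650 µs, in the range reported for human listeners.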
Our results support previous findings, particularly in congenital blindness (Weeks et al. 2000), of an involvement of occipital areas in cross-modal auditory processing, and suggest that a similar response can be found in late-onset blindness. This strengthens the hypothesis that the visual cortex remains functional in peripheral blindness, possibly as a result of plastic reorganization as well as of functional recruitment.
Current studies are directed at further verifying the involvement of parieto-occipital areas in perceiving the surrounding acoustic stimulus in both early- and late-onset blindness.
We thank all the participating volunteers. This work is supported by grants from the Spanish Ministry of Science and Technology and from European Funds (FEDER), ref. 1FD1997-1237.