000 02231nab a2200265 4500
003 OSt
005 20220803120524.0
007 cr aa aaaaa
008 220722b |||||||| |||| 00| 0 eng d
100 _aRichardson, Michael
_949779
245 _aHow Much Spatial Information Is Lost in the Sensory Substitution Process? Comparing Visual, Tactile, and Auditory Approaches.
260 _bSage
_c2019
300 _aVol. 48, issue 11 (2019): p. 1079-1103.
520 _aSensory substitution devices (SSDs) can convey visuospatial information through spatialised auditory or tactile stimulation using wearable technology. However, the level of information loss associated with this transformation is unknown. In this study, novice users discriminated the location of two objects at 1.2 m using devices that transformed a 16 × 8 depth map into spatially distributed patterns of light, sound, or touch on the abdomen. Results showed that through active sensing, participants could discriminate the vertical position of objects to a visual angle of 1°, 14°, and 21°, and their distance to 2 cm, 8 cm, and 29 cm using these visual, auditory, and haptic SSDs, respectively. Visual SSDs significantly outperformed auditory and tactile SSDs on vertical localisation, whereas for depth perception, all devices significantly differed from one another (visual > auditory > haptic). Our findings highlight the high level of acuity possible for SSDs even with low spatial resolutions (e.g., 16 × 8) and quantify the level of information loss attributable to this transformation for the SSD user. Finally, we discuss ways of closing this “modality gap” found in SSDs and conclude that this process is best benchmarked against performance with SSDs that return to their primary modality (e.g., visuospatial into visual).
650 _asensory substitution
_949780
650 _aspatial perception
_949781
650 _ahearing
_949782
650 _atouch
_948945
650 _avision
_949315
700 _aThar, Jan
_949783
700 _aAlvarez, James
_949784
773 0 _012374
_916462
_dSage,
_tPerception
_x1468-4233
856 _uhttps://doi.org/10.1177/0301006619873194
942 _2ddc
_cART
999 _c12513
_d12513