“…Using the EyeCane’s distance information together with whole-scene SSDs (Auvray, Hanneton, & O’Regan, 2007; Meijer, 1992), which convey visual information via audition or touch (indeed, by some definitions the EyeCane itself would be classified as a “minimalistic sensory-substitution device”), such as the EyeMusic (Abboud, Hanassy, Levy-Tzedek, Maidenbaum, & Amedi, 2014; Levy-Tzedek, Hanassy, Abboud, Maidenbaum, & Amedi, 2012), the vOICe (Meijer, 1992), or the BrainPort (Bach-y-Rita & Kercel, 2003; see Proulx et al., 2015, for a review of SSDs), could enable users to integrate both kinds of information and thereby better understand their environment. For example, users could both recognize an object of interest and judge its distance, or both identify obstacles and track them as they are approached (see Reiner, 2008, for additional potential benefits of such pairing).…”