Fig. 1. A pedestrian detection scenario demonstrating the data collection in our study. A blind person, wearing smart glasses with our working prototype, and a sighted person walk toward each other in a corridor. The smart glasses detect the sighted passerby, estimate his proximity, and share his relative location and head pose.

The spatial behavior of passersby can be critical for blind individuals to initiate interactions, preserve personal space, or practice social distancing during a pandemic. Among other use cases, wearable cameras employing computer vision can extract proxemic signals of others and thus increase blind people's access to the spatial behavior of passersby. Analyzing data collected in a study with blind (N=10) and sighted (N=40) participants, we explore: (i) visual information on approaching passersby captured by a head-worn camera; (ii) pedestrian detection algorithms for extracting proxemic signals such as passerby presence, relative position, distance, and head pose; and (iii) opportunities and limitations of using wearable cameras to help blind people access proxemics related to nearby people. Our observations and findings provide insights into dyadic behaviors for assistive pedestrian detection and lead to implications for the design of future head-worn cameras and interactions.

CCS Concepts: • Human-centered computing → User studies; Empirical studies in HCI; Mobile devices; Empirical studies in accessibility; Accessibility technologies; • Computing methodologies → Computer vision.
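To illustrate the kind of proxemic signal extraction mentioned above, the following is a minimal sketch of estimating a detected passerby's distance from a pedestrian detection bounding box, assuming a pinhole camera model; the focal length and average person height are illustrative assumptions, not values from the study.

```python
# Hedged sketch: distance to a detected pedestrian from bounding-box
# height, via the pinhole camera model (similar triangles). Both
# constants below are assumptions for illustration only.

AVG_PERSON_HEIGHT_M = 1.7   # assumed average pedestrian height (metres)
FOCAL_LENGTH_PX = 1000.0    # assumed camera focal length (pixels)

def estimate_distance(bbox_height_px: float) -> float:
    """Approximate distance to a detected person:
    distance = (real height * focal length) / pixel height."""
    return AVG_PERSON_HEIGHT_M * FOCAL_LENGTH_PX / bbox_height_px

# Under these assumptions, a 425-pixel-tall detection corresponds
# to a passerby roughly 4 metres away.
print(round(estimate_distance(425.0), 2))  # → 4.0
```

In practice such an estimate would be combined with the detector's horizontal box position (for relative direction) and a head-pose estimator, as the study's prototype does with its own pipeline.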