“…Proximity interaction techniques can take advantage of pointing gestures to express locations or objects intuitively and with minimal cognitive overhead; this modality has often been used in HRI research, e.g., for pick-and-place tasks (Brooks & Breazeal, ; Cosgun, Trevor, & Christensen, ; Droeschel, Stückler, & Behnke, ; Großmann et al., ), labeling and/or querying information about objects or locations (Akkil & Isokoski, ; Brooks & Breazeal, ; Pateraki, Baltzakis, & Trahanias, ), selecting a robot within a group (Nagi, Giusti, Gambardella, & Di Caro, ; Pourmehr, Monajjemi, Wawerla, Vaughan, & Mori, ), and providing navigational goals (Raza Abidi, Williams, & Johnston, ; Gromov et al., , ; Jevtić, Doisy, Parmet, & Edan, ; Tölgyessy et al., ; Van den Bergh et al., ; Wolf et al., ). Such gestures can enable rescue workers to easily direct multiple robots, including different robot types, through the same interface (see Figure ).…”