Ultrasound haptics is a contactless haptic technology that enables novel mid-air interactions with rich multisensory feedback. This paper surveys recent advances in ultrasound haptic technology. We discuss the fundamentals of the technology, how a variety of perceptible sensations are rendered, and how it is currently being used to enable novel interaction techniques. We summarize its strengths, weaknesses, and potential applications across various domains. We conclude with our perspective on key directions for this promising haptic technology.
Fig. 1. Our novel propagation workflow makes it easy to propagate visual designs to numerous datasets. Reference visualizations are created for data streams, which are associated with several keywords in our ontology. A search-and-activate process is used to propagate the reference visualization to other appropriate data streams. (1) Ontology keywords are used to construct a query in our search UI for suitable data stream combinations. (2) Search results consist of ranked data stream combinations that match query parameters, although some results may not be suitable for propagation. (3) A quality assurance step carried out by an expert ensures the visual design is only propagated to suitable data, resulting in new visualizations that are immediately deployed as web pages.
Above-device gesture interfaces let people interact in the space above mobile devices using hand and finger movements. For example, users could gesture over a mobile phone or wearable without having to use the touchscreen. We look at how above-device interfaces can also give feedback in the space over the device. Recent haptic and wearable technologies give new ways to provide tactile feedback while gesturing, letting touchless gesture interfaces give touch feedback. In this paper we take a first detailed look at how tactile feedback can be given during above-device interaction. We compare approaches for giving feedback (ultrasound haptics, wearables and direct feedback) and also look at feedback design. Our findings show that tactile feedback can enhance above-device gesture interfaces.
Acoustic levitation enables a radical new type of human-computer interface composed of small levitating objects. For the first time, we investigate the selection of such objects, an important part of interaction with a levitating object display. We present Point-and-Shake, a mid-air pointing interaction for selecting levitating objects, with feedback given through object movement. We describe the implementation of this technique and present two user studies that evaluate it. The first study found that users could accurately (96%) and quickly (4.1s) select objects by pointing at them. The second study found that users were able to accurately (95%) and quickly (3s) select occluded objects. These results show that Point-and-Shake is an effective way of initiating interaction with levitating object displays.
When users want to interact with an in-air gesture system, they must first address it. This involves finding where to gesture so that their actions can be sensed, and how to direct their input towards that system so that they do not also affect others or cause unwanted effects. This is an important problem [6] which lacks a practical solution. We present an interaction technique which uses multimodal feedback to help users address in-air gesture systems. The feedback tells them how ("do that") and where ("there") to gesture, using light, audio and tactile displays. By doing that there, users can direct their input to the system they wish to interact with, in a place where their gestures can be sensed. We discuss the design of our technique and three experiments investigating its use, finding that users can "do that" well (93.2%-99.9%) while accurately (51mm-80mm) and quickly (3.7s) finding "there".
We demonstrate a technique for rendering textured haptic surfaces in mid-air, using an ultrasound haptic display. Our technique renders tessellated 3D 'haptic' shapes with different waveform properties, creating surfaces with distinct perceptions.