Figure 1: Different interaction landscapes representing the interactions of a motion driver with a static object. We capture the motion trajectories (red) and encode their signatures into a descriptor that can be used for comparing interactions. From left to right: a cloth simulation interacting with a support structure, a human model walking on a floor, a wind simulation interacting with a car, and a robotic hand grasping a cup.
Abstract

Interactions play a key role in understanding objects and scenes for both virtual and real-world agents. We introduce a new general representation for proximal interactions among physical objects that is agnostic to the types of objects or interactions involved. The representation is based on tracking particles on one of the participating objects and then observing them with sensors appropriately placed in the interaction volume or on the interaction surfaces. We show how to factorize these interaction descriptors and project them into a particular participating object so as to obtain a new functional descriptor for that object, its interaction landscape, capturing its observed use in a spatio-temporal framework. Interaction landscapes are independent of the particular interaction and capture subtle dynamic effects in how objects move and behave when in functional use. Our method relates objects based on their function, establishes correspondences between shapes based on functional key points and regions, and retrieves peer and partner objects with respect to an interaction.
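The particle-and-sensor construction described above can be pictured with a short sketch. The following is a minimal, illustrative Python fragment, not the paper's actual formulation: the array shapes, the `interaction_descriptor` function, the distance-based activation rule, and the count-based aggregation are all assumptions made here for concreteness.

```python
# Minimal sketch of a spatio-temporal interaction descriptor:
# particles tracked on one object are "observed" by sensors placed
# in the interaction volume. All names and the activation rule are
# illustrative assumptions, not the method from the paper.
import numpy as np

def interaction_descriptor(trajectories, sensors, radius=0.05):
    """Build a per-sensor, per-timestep activity matrix.

    trajectories : (T, P, 3) array -- positions of P tracked particles
                   over T timesteps (e.g. points sampled on a cloth).
    sensors      : (S, 3) array    -- sensor positions in the interaction
                   volume or on the interaction surfaces.
    radius       : a sensor counts a particle that passes within this
                   distance (assumed activation rule).

    Returns an (S, T) array: particle activity seen by each sensor over time.
    """
    T, _, _ = trajectories.shape
    S = sensors.shape[0]
    descriptor = np.zeros((S, T))
    for t in range(T):
        # Pairwise sensor-to-particle distances at timestep t.
        d = np.linalg.norm(
            sensors[:, None, :] - trajectories[t][None, :, :], axis=-1
        )
        # Number of particles each sensor observes at this timestep.
        descriptor[:, t] = (d < radius).sum(axis=1)
    return descriptor

# Toy usage: 100 timesteps, 50 particles, 20 sensors in a unit cube.
rng = np.random.default_rng(0)
traj = rng.random((100, 50, 3))
sens = rng.random((20, 3))
D = interaction_descriptor(traj, sens)
print(D.shape)  # (20, 100)
```

Under these assumptions, each row of the resulting matrix is one sensor's view of the motion over time, so two interactions can be compared by comparing their matrices; the paper's actual descriptor and its factorization are richer than this count-based toy.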