We present a set of cross-dimensional interaction techniques for a hybrid user interface that integrates existing 2D and 3D visualization and interaction devices. Our approach is built around one- and two-handed gestures that support the seamless transition of data between co-located 2D and 3D contexts. Our testbed environment combines a 2D multi-user, multi-touch projection surface with 3D head-tracked, see-through head-worn displays and 3D tracked gloves to form a multi-display augmented reality. We address some of the ways in which we can interact with private data in a collaborative, heterogeneous workspace. We also report on a pilot usability study to evaluate the effectiveness and ease of use of the cross-dimensional interactions.