This article presents an intuitive environment for remote micromanipulation composed of both haptic feedback and virtual reconstruction of the scene. To enable nonexpert users to perform complex teleoperated micromanipulation tasks, it is of utmost importance to provide them with information about the 3-D relative positions of the objects and the tools. Haptic feedback is an intuitive way to transmit such information. Since position sensors are not available at this scale, visual feedback is used to derive information about the scene. In this work, three different techniques are implemented, evaluated, and compared to derive object positions from scanning electron microscope (SEM) images. The modified correlation matching with generated template algorithm is accurate and provides reliable detection of objects. To track the tool, a marker-based approach is chosen, since fast detection is required for stable haptic feedback. Information derived from these algorithms is used to propose an intuitive remote manipulation system that enables users situated in geographically distant sites to benefit from specific equipment, such as SEMs. Stability of the haptic feedback is ensured by the minimization of delays, the computational efficiency of the vision algorithms, and the proper tuning of the haptic coupling. Virtual guides are proposed to avoid any involuntary collisions between the tool and the objects. This approach is validated by a teleoperation experiment involving melamine microspheres with a diameter of less than 2 µm, performed between Paris, France, and Oldenburg, Germany.
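
To illustrate the object-detection step, the sketch below shows correlation matching against a generated template, in the spirit of the algorithm named above. It is a minimal sketch under assumptions not stated in the abstract: a synthetic disc template as a stand-in for a microsphere's appearance in SEM images, and OpenCV's normalized cross-correlation as the matching criterion; the paper's specific modification to the correlation step is not reproduced here, and the function names are illustrative.

```python
# Hypothetical sketch: detect microspheres by correlating an SEM image
# (8-bit grayscale) against a synthetically generated disc template.
import numpy as np
import cv2

def make_sphere_template(radius_px, size=None):
    """Generate a bright disc on a dark background as a rough model
    of a microsphere's appearance (an assumption, not the paper's model)."""
    size = size or 2 * radius_px + 5
    template = np.zeros((size, size), dtype=np.uint8)
    cv2.circle(template, (size // 2, size // 2), radius_px, 255, thickness=-1)
    return cv2.GaussianBlur(template, (5, 5), 0)  # soften the hard edge

def detect_spheres(image, radius_px, threshold=0.6):
    """Return (x, y) positions where normalized cross-correlation with
    the generated template exceeds the threshold."""
    template = make_sphere_template(radius_px)
    response = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    ys, xs = np.where(response >= threshold)
    h, w = template.shape
    # Shift peaks to template centers; a real implementation would also
    # apply non-maximum suppression to merge neighboring detections.
    return [(x + w // 2, y + h // 2) for x, y in zip(xs, ys)]
```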
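
The virtual guides mentioned above can likewise be sketched. The abstract does not give their formulation, so the following assumes a simple spring-like repulsive force that acts on the haptic handle whenever the tracked tool tip enters a safety region around a detected object; the safety radius, stiffness, and function name are illustrative assumptions, not the authors' parameters.

```python
# Hypothetical sketch: spring-like virtual guide repelling the tool
# from a detected object once it crosses a safety radius.
import numpy as np

def virtual_guide_force(tool_pos, object_pos, safety_radius, stiffness):
    """Return a repulsive force vector; zero outside the safety region.
    Positions are vision-derived coordinates (e.g., pixels or micrometers)."""
    offset = np.asarray(tool_pos, dtype=float) - np.asarray(object_pos, dtype=float)
    distance = np.linalg.norm(offset)
    if distance >= safety_radius or distance == 0.0:
        return np.zeros_like(offset)
    # Force grows linearly with penetration into the safety region,
    # pushing the operator's hand away before contact occurs.
    return stiffness * (safety_radius - distance) * (offset / distance)
```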