Touchless interaction has received considerable attention in recent years because it removes the burden of physical contact. Several strategies exist for achieving mid-air interaction; however, most of them directly map the 2D WIMP GUI onto a 3D user interface, which leads to unnatural results. In this paper, we design interaction gestures and tools for exploring volume datasets that mimic the corresponding tasks in the real world. Our interaction tools build mainly on the idea of focus + context, realized through GPU volume raycasting with a trapezoid-shaped transfer function. User studies were conducted to evaluate the usability and intuitiveness of our method. The experimental results show a significant advantage in completion time after a short period of training.
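
To make the transfer-function idea concrete, the following is a minimal illustrative sketch (not the authors' implementation) of a trapezoid-shaped opacity transfer function: opacity ramps up linearly, stays at a plateau over the focus range, and ramps back down, so scalar values inside the plateau are emphasized while neighboring values provide context. All parameter names here are hypothetical.

```python
import numpy as np

def trapezoid_tf(values, lo, lo_top, hi_top, hi, max_opacity=1.0):
    """Map scalar values to opacity via a trapezoid shape.

    Hypothetical parameters: opacity rises linearly on [lo, lo_top],
    is max_opacity on the plateau [lo_top, hi_top], and falls linearly
    on [hi_top, hi]; values outside [lo, hi] are fully transparent.
    """
    v = np.asarray(values, dtype=float)
    op = np.zeros_like(v)
    rising = (v >= lo) & (v < lo_top)
    op[rising] = (v[rising] - lo) / (lo_top - lo)
    plateau = (v >= lo_top) & (v <= hi_top)
    op[plateau] = 1.0
    falling = (v > hi_top) & (v <= hi)
    op[falling] = (hi - v[falling]) / (hi - hi_top)
    return op * max_opacity
```

In a raycaster, such a function would be evaluated per sample along each ray; moving the plateau interactively shifts the focus region of the volume.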