Recent advances in mobile computing and augmented reality (AR) technology have led to the popularization of mobile AR applications. Touch screen interfaces are common on mobile devices and are widely used in AR applications running on such devices as smartphones. However, due to unsteady camera movement in handheld AR environments, it is difficult to carry out precise interactions, such as drawing, especially when tracing physical objects. In this paper, we investigate two interaction techniques, Freeze-Set-Go and Snap-To-Feature, that help users perform more accurate touch-screen-based AR interactions. The two techniques are compared in a user experiment involving a task of tracing physical objects, a task encountered when annotating or modeling physical objects within an AR scene. The results show that a combination of the two techniques significantly improves the accuracy and usability of touch-screen-based AR interaction.