Sound can be used to give orientation, draw the listener's attention in a certain direction, and provide navigational cues in virtual as well as physical environments. In analogy to the concept of visual pointers, we call such sounds Auditory Pointers. While previous work has mainly focused on the spatial localization of sounds, we complement this by using properties of the sound itself: loudness, timbre, and pitch can be used to sonify the distance and direction to a target point. In this paper, we describe an exemplary implementation of the respective sound synthesis techniques and investigate the effectiveness of the different properties in a user study. The findings reveal substantial differences between the sound parameters and provide guidance for functional sound design.
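The kind of parameter mapping the abstract describes can be illustrated with a minimal sketch. This is a hypothetical mapping (the function name, ranges, and curve are assumptions, not the paper's implementation): distance to the target is mapped to gain, so the pointer gets louder as the listener approaches, and to pitch, rising up to one octave above a base frequency at the target.

```python
def sonify(distance, max_distance=20.0, base_freq=440.0):
    """Hypothetical auditory-pointer mapping: distance -> (gain, frequency).

    Closer targets sound louder and higher-pitched; beyond max_distance
    the pointer is silent at the base frequency. Illustrative only.
    """
    # Clamp distance to the valid range [0, max_distance].
    d = min(max(distance, 0.0), max_distance)
    # Linear gain: 1.0 at the target, 0.0 at max_distance.
    gain = 1.0 - d / max_distance
    # Pitch rises exponentially toward one octave above base_freq at the target.
    freq = base_freq * 2 ** ((max_distance - d) / max_distance)
    return gain, freq
```

For example, `sonify(0.0)` yields full gain at double the base frequency, while `sonify(20.0)` yields silence at the base frequency. A real system would also encode direction, e.g. via spatial panning or a timbre change.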