Recent studies suggest that artificial auditory feedback, called movement sonification, can function as sensory substitution in motor control, replacing or augmenting vision and proprioception. To a first approximation, studies either apply (1) natural sonifications (exploiting ecological spatial encodings or action-sound associations) of kinematic variables, e.g. velocity, to high (≥3) degree-of-freedom movements, or (2) nonnatural, i.e. unfamiliar, sonifications of variables including spatial position to low degree-of-freedom movements. To the best of our knowledge, no one has shown that unfamiliar sonifications of spatial position can be used to guide high degree-of-freedom movements. We reasoned that an ultra-responsive (1-2 ms latency, 1000 Hz sampling) sonification compacting 3D spatial information into a low (<3) number of acoustic dimensions would enable spatial guidance of a high degree-of-freedom movement. We constructed such a movement sonification system using a bespoke digital sound synthesis technique (two-timer pulse-width modulation), a real-time time-warping algorithm, and unaligned quaternion representations of limb rotation. We used the system to sonify unconstrained reaches in space in a way that indicated invisible targets. We validated the system with hardware benchmarking, and we validated the approach by showing that users reached for targets presented via sonification with approximately half the error of either (a) reaching while listening to qualitatively similar auditory feedback lacking spatial information or (b) reaching randomly while listening to a constant tone.
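The core idea of compacting spatial information into a low number of acoustic dimensions can be illustrated with a minimal sketch. The code below is an illustrative assumption, not the system described above: it computes the angular distance between a current and a target limb orientation (both as unit quaternions) and maps that single error value onto one acoustic dimension, pitch. The function names, the 220-880 Hz range, and the exponential pitch mapping are all hypothetical choices for illustration.

```python
import math

def quat_angle(q1, q2):
    # Angular distance (radians, in [0, pi]) between two unit
    # quaternions given as (w, x, y, z) tuples. The abs() handles
    # the double cover: q and -q represent the same rotation.
    dot = abs(sum(a * b for a, b in zip(q1, q2)))
    dot = min(1.0, dot)  # guard against floating-point overshoot
    return 2.0 * math.acos(dot)

def angle_to_pitch(angle, f_near=880.0, f_far=220.0):
    # Collapse 3D orientation error onto one acoustic dimension:
    # small error -> high pitch, large error -> low pitch, using an
    # exponential (perceptually musical) interpolation.
    t = angle / math.pi  # normalise error to [0, 1]
    return f_far * (f_near / f_far) ** (1.0 - t)

# Example: forearm rotated 45 degrees away from an invisible target.
target = (1.0, 0.0, 0.0, 0.0)                                  # identity rotation
current = (math.cos(math.pi / 8), math.sin(math.pi / 8), 0, 0)  # 45 deg about x
angle = quat_angle(target, current)
print(round(math.degrees(angle)))    # 45
print(round(angle_to_pitch(angle)))  # a pitch strictly between 220 and 880 Hz
```

In a real-time system this mapping would run inside the 1000 Hz feedback loop, so the listener hears pitch rise continuously as the reach converges on the target.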