Traditional myoelectric control of prostheses for transhumeral amputees fails to provide intuitive coordination of the necessary degrees of freedom. Building upon promising advances in movement-based control and computer vision, we previously demonstrated that reconstructing the distal joints from Artificial Neural Network (ANN) predictions, given the shoulder posture and the movement goal (i.e., the position and orientation of the targeted object), enables participants to position and orient an avatar hand to grasp objects scattered throughout a wide workspace with performance comparable to that of a natural arm. However, this previous control produced rapid and unintended prosthesis movements caused by sudden changes in the ANN predictions at each modification of the movement goal, rendering it impractical for real-life scenarios. Here, we designed and tested a novel approach to eliminate these abrupt changes, based on an angular trajectory determined from the speed of stump movement and the gap remaining between the current and 'goal' distal configurations. Two methods are presented to define this 'goal' configuration, relying either solely on the movement goal or also taking the current shoulder posture into account. Despite a slight increase in movement time, the new controls allowed twelve nondisabled participants and six participants with transhumeral limb deficiency to reach objects at various positions and orientations in a virtual reality setup without prior training. The good performance achieved with these controls, particularly when the current shoulder posture was accounted for, represents a necessary step towards applications in real-world scenarios.