This paper presents a human-robot interaction (HRI) study of a dedicated Mission Specialist interface for performing telemanipulation tasks with a small unpiloted aerial vehicle (UAV). Current literature suggests that successful completion of aerial manipulation tasks in real-world environments requires human input due to challenges in autonomous perception and control. Visual information about the remote environment presented in a telemanipulation interface can significantly affect performance under direct control; however, the effects of interface visualizations on task performance have not been studied for UAV telemanipulation. This work evaluated the effects of interface viewpoint on aerial manipulation task performance. The interface evaluated in this study included video streams from cameras located onboard the UAV: i) a manipulator egocentric view, ii) a manipulator exocentric view, and iii) a combination of egocentric and exocentric views. A total of 36 participants completed three different manipulation tasks using all three interface conditions. The observations and results showed that both the exocentric and mixed configurations contributed to improved task performance over an egocentric-only configuration.