Abstract-This paper documents our progress toward an unmanned aerial vehicle capable of autonomously mapping urban environments. This encompasses localization and tracking of the vehicle's pose; fusion of sensor data from onboard GNSS receivers, IMUs, laser scanners, and cameras; and real-time path planning and collision avoidance. We currently focus on a physics-based approach to computing waypoints, which are then used to steer the platform in three-dimensional space. The generation of efficient sensor trajectories for maximized information gain operates directly on unorganized point clouds, making the method well suited to environment mapping with commonly used LIDAR sensors and time-of-flight cameras. We present the algorithm's application to real sensor data and analyze its performance in a virtual outdoor scenario.