Laser range finders and omnidirectional cameras are becoming a promising combination of sensors for extracting rich environmental information, such as textured planes, vanishing points, catadioptric projections of vertical and horizontal lines, and invariant image features. However, many indoor scenes lack sufficient texture to describe the environment; in these situations, vertical edges can be used instead. This study presents a sensor model that extracts the three-dimensional position of vertical edges from a range-augmented omnidirectional vision sensor. Using the unified spherical model for central catadioptric sensors together with the proposed sensor model, the vertical edges are locally projected, improving data association for mapping and localisation. The proposed sensor model was tested with the FastSLAM algorithm to solve the simultaneous localisation and mapping problem in indoor environments. Real-world qualitative and quantitative experiments validate the proposed approach using a Pioneer-3DX mobile robot equipped with a URG-04LX laser range finder and an omnidirectional camera with a parabolic mirror.

This work has been partially supported by the project RAIMON - Autonomous Underwater Robot for Marine Fish Farms Inspection and Monitoring (Ref. CTM2011-29691-C02-02) funded by the Spanish Ministry of Science and Innovation, the LASPAU-COLCIENCIAS grant no. 136-2008, the University of Valle contract no. 644-19-04-95, and the consolidated research group's grant no. SGR2009-0038.
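The unified spherical model referenced above projects a 3D point in two steps: first onto the unit sphere, then from a point shifted by a mirror parameter ξ onto the image plane (ξ = 1 for a parabolic mirror with an orthographic camera). A minimal sketch of this projection is shown below; the intrinsic values `f`, `u0`, and `v0` are illustrative assumptions, not calibration parameters from the paper.

```python
import numpy as np

def unified_projection(X, xi=1.0, f=300.0, u0=320.0, v0=240.0):
    """Project a 3D point X (in the mirror/camera frame) to image
    coordinates using the unified spherical model.

    xi : mirror parameter (1.0 for a parabolic mirror).
    f, u0, v0 : illustrative intrinsics, assumed for this sketch.
    """
    Xs = np.asarray(X, dtype=float)
    Xs = Xs / np.linalg.norm(Xs)      # step 1: project onto the unit sphere
    denom = Xs[2] + xi                 # step 2: reproject from (0, 0, -xi)
    u = f * Xs[0] / denom + u0
    v = f * Xs[1] / denom + v0
    return np.array([u, v])

# Sample points along a hypothetical vertical edge at (x, y) = (1, 0):
edge_pixels = [unified_projection([1.0, 0.0, z]) for z in (-0.5, 0.0, 0.5, 1.0)]
```

Note how the sampled points of the vertical edge all share the same v-coordinate here: under this model, with the sensor axis vertical, a vertical 3D line maps to a radial line through the principal point, which is what makes vertical edges convenient landmarks for data association.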