In this paper, we propose a new technique for compensating for the radial and perspective distortion of photos acquired with a wide-angle lens, using facial features detected in the images rather than predefined calibration patterns. The proposed algorithm utilizes a statistical facial feature model to recover the radial distortion, and the detected facial features are further used for an adaptive cylindrical projection that reduces perspective distortion near the image boundary. Our algorithm has several advantages over traditional methods. First, traditional calibration patterns, such as man-made straight buildings, chessboards, or calibration cubes, are not required. Second, although radial distortion can be corrected by several conventional methods, most of them produce photos with larger perspective distortion on faces than our method does. The system consists of four components: offline training of the statistical facial feature model, feature point extraction from distorted faces, estimation of the radial distortion parameters and compensation of the radial distortion, and adaptive cylindrical projection. To estimate the distortion parameters, we propose an energy function that measures the fitness between the undistorted coordinates of the facial feature points extracted from the distorted input image and the learned statistical facial feature model. Given the distortion parameters, the fitness is computed by solving a linear least-squares system, and the parameters that minimize the cost function are found by a hierarchical search. Experimental results demonstrate that the proposed method reduces distortion in the corrected images.
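As a rough illustration of the parameter-estimation step, the sketch below assumes a one-parameter division model for the radial distortion and uses only a mean face shape as a stand-in for the learned statistical feature model; the function names (`undistort_points`, `fitness`, `estimate_k`) and the coarse-to-fine grid refinement are illustrative assumptions, not the paper's exact formulation. For a candidate distortion coefficient, the detected feature points are undistorted, scored by the residual of a linear least-squares similarity fit to the mean shape, and the coefficient is refined hierarchically.

```python
import numpy as np

def undistort_points(pts_d, k, center):
    """Undistort feature points with a one-parameter division model
    x_u = center + (x_d - center) / (1 + k * r^2).  (Assumed model.)"""
    d = pts_d - center
    r2 = np.sum(d ** 2, axis=1, keepdims=True)
    return center + d / (1.0 + k * r2)

def fitness(pts_u, mean_shape):
    """Fitness of a candidate undistortion: residual of the best 2D
    similarity transform (solved as a linear least-squares system)
    mapping the learned mean shape onto the undistorted points."""
    n = mean_shape.shape[0]
    A = np.zeros((2 * n, 4))
    A[0::2, 0] = mean_shape[:, 0]; A[0::2, 1] = -mean_shape[:, 1]; A[0::2, 2] = 1.0
    A[1::2, 0] = mean_shape[:, 1]; A[1::2, 1] =  mean_shape[:, 0]; A[1::2, 3] = 1.0
    b = pts_u.reshape(-1)
    sol, _, _, _ = np.linalg.lstsq(A, b, rcond=None)
    return float(np.sum((A @ sol - b) ** 2))

def estimate_k(pts_d, mean_shape, center,
               k_range=(-1e-6, 1e-6), levels=4, samples=11):
    """Hierarchical (coarse-to-fine) search for the distortion parameter
    that minimizes the fitness energy."""
    lo, hi = k_range
    best_k = 0.0
    for _ in range(levels):
        ks = np.linspace(lo, hi, samples)
        costs = [fitness(undistort_points(pts_d, k, center), mean_shape)
                 for k in ks]
        best_k = ks[int(np.argmin(costs))]
        step = (hi - lo) / (samples - 1)
        lo, hi = best_k - step, best_k + step   # zoom in around the best value
    return best_k
```

In practice the statistical model would be richer than a single mean shape (e.g., a trained shape distribution), and the search could cover additional parameters such as the distortion center, but the structure of the energy minimization follows the same pattern.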