Smart mobile devices are widely used in daily life activities. Interacting with these devices, however, requires users to touch the device's screen with their hands, which excludes users with hand disabilities. In this paper, we introduce a Brain-Computer Interface (BCI) system that allows users to interact with smart mobile devices using electroencephalography (EEG) signals. Two applications are introduced for Android-based devices: RunApp and ImgView. RunApp enables users to select and launch any application installed on the device, while ImgView allows viewing and manipulating images. Both applications rely on the P300 signal paradigm. We utilize a computationally efficient Principal Component Analysis (PCA) ensemble classifier to identify P300 signals. Results demonstrate the efficacy of the introduced applications using limited training data and a limited number of trials, with average online accuracies of 79.17±13.69% for RunApp and 87.5±8.74% for ImgView across 6 different subjects. In addition, we demonstrate the cross-subject generalization capability of the algorithm, achieving average online accuracies of 54.17±20.24% for RunApp and 55.56±11.38% for ImgView. These results demonstrate the feasibility of running BCI applications on smart mobile devices.