Hyper basis function neural networks (HBFNNs) have attracted considerable attention in recent years and have shown good performance in a variety of application domains. In this paper, we first briefly review the development of neural networks. We then present the structure of HBFNNs in detail. HBFNNs extend radial basis function neural networks (RBFNNs) by replacing the Euclidean norm with a weighted norm when measuring the distance from the input data to the hidden-layer neuron centres. This change gives HBFNNs stronger generalization ability than RBFNNs. Subsequently, we summarize several commonly used training methods for HBFNNs, covering both static and dynamic training methods. Finally, we outline several typical application fields of HBFNNs.
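As a minimal sketch of the weighted-norm idea, the following compares a standard Gaussian RBF unit with a hyper basis function unit whose squared distance is (x - c)^T W^T W (x - c). This is one common HBF formulation, not necessarily the exact one used in this paper; the function names, the matrix W, and the sample values are illustrative assumptions.

```python
import numpy as np

def rbf_activation(x, c, sigma=1.0):
    """Standard RBF unit: Gaussian of the Euclidean distance to centre c."""
    d2 = np.sum((x - c) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hbf_activation(x, c, W):
    """HBF unit: Gaussian of the weighted norm ||W(x - c)||, i.e.
    squared distance d2 = (x - c)^T W^T W (x - c)."""
    diff = W @ (x - c)
    return np.exp(-diff @ diff)

x = np.array([1.0, 2.0])
c = np.array([0.0, 0.0])

# A diagonal W scales each input dimension differently (hypothetical values),
# which is one way the weighted norm generalizes the Euclidean one.
W = np.diag([0.5, 0.1])
print(hbf_activation(x, c, W))

# Sanity check: with W = I / (sqrt(2) * sigma), the weighted norm reduces to
# the scaled Euclidean norm, so the HBF unit matches the RBF unit exactly.
sigma = 1.0
W_eq = np.eye(2) / (np.sqrt(2.0) * sigma)
print(np.isclose(hbf_activation(x, c, W_eq), rbf_activation(x, c, sigma)))
```

Because W is learned rather than fixed, each hidden unit can adapt its own distance metric to the data, which is the source of the improved generalization claimed for HBFNNs over RBFNNs.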