Abstract-We analyze a k-nearest neighbor (k-NN) class of plug-in estimators for Shannon entropy and Rényi entropy. Based on the statistical properties of k-NN balls, we derive explicit rates for the bias and variance of these plug-in estimators in terms of the sample size, the dimension of the samples, and the underlying probability distribution. In addition, we establish a central limit theorem for the plug-in estimator that allows us to construct confidence intervals for the entropy functionals. As an application, we apply our theory to anomaly detection problems to set thresholds that achieve desired false alarm rates.
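
For concreteness, a well-known member of this k-NN plug-in family is the classical Kozachenko-Leonenko estimator of Shannon entropy, which uses the distance from each sample to its k-th nearest neighbor. The sketch below is illustrative only (the function name, its parameters, and the choice of k are assumptions, not taken from the paper), but it shows the general shape of such an estimator:

```python
import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN estimate of Shannon entropy (in nats).

    Illustrative sketch, not the paper's exact estimator.
    x : (n, d) array of n samples in d dimensions; k : neighbor order.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Distance to the k-th nearest neighbor; query k+1 neighbors
    # because each point is its own nearest neighbor at distance 0.
    eps, _ = tree.query(x, k=k + 1)
    eps = eps[:, -1]
    # Log-volume of the d-dimensional unit Euclidean ball.
    log_c_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    # Plug-in formula: psi(n) - psi(k) + log c_d + (d/n) * sum_i log eps_i
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))
```

On samples from a standard normal, for instance, the estimate should approach the true entropy 0.5*log(2*pi*e) as the sample size grows, consistent with the bias and variance rates the abstract describes.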