We discuss the frequency of desynchronization events in power grids for realistic data input. We focus on the role of time correlations in the fluctuating power production and propose a new method for implementing colored noise that reproduces non-Gaussian data. We propose and extend different methods of dimensional reduction that considerably reduce the high-dimensional phase space and allow the rare desynchronization events to be predicted at reasonable computational cost. The first method splits the system into two areas, connected by heavily loaded lines, and treats each area as a single node. The second method disregards the dynamics of the phase angles entirely and considers power fluctuations only, treating any desynchronization event as an overload. The fact that the latter is accurate, albeit only to exponential accuracy in the strength of the fluctuations, implies that the number of rare events does not depend sensitively on inertia or damping for realistic heterogeneous parameters. Nor does this number automatically increase with non-Gaussian fluctuations in the power production, as one might have expected. The analytical expressions for the average time to desynchronization, however, depend sensitively on the finite correlation time of the fluctuating power input.
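One generic route to colored noise with prescribed non-Gaussian one-point statistics, sketched here purely as an illustration and not as the specific method proposed in this work, is to simulate an Ornstein-Uhlenbeck process with correlation time tau and map it through an inverse-CDF transform onto the target marginal (here an exponential distribution, chosen only as an example); the temporal correlations are inherited from the underlying Gaussian process.

```python
import numpy as np
from math import erf, sqrt

def ou_process(n_steps, dt, tau, rng):
    """Zero-mean, unit-variance Ornstein-Uhlenbeck process,
    dx = -(x/tau) dt + sqrt(2/tau) dW, via the exact one-step update."""
    a = np.exp(-dt / tau)
    b = sqrt(1.0 - a * a)          # keeps the stationary variance at 1
    x = np.empty(n_steps)
    x[0] = rng.standard_normal()
    for i in range(1, n_steps):
        x[i] = a * x[i - 1] + b * rng.standard_normal()
    return x

def to_exponential_marginal(x, mean=1.0):
    """Inverse-CDF (copula) transform: Gaussian marginal -> exponential
    marginal with the given mean, preserving the rank correlations."""
    u = np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in x])  # Phi(x)
    u = np.clip(u, 1e-12, 1.0 - 1e-12)
    return -mean * np.log(1.0 - u)  # inverse CDF of Exp(1/mean)

rng = np.random.default_rng(0)
tau = 5.0                          # correlation time of the fluctuations
x = ou_process(20000, 0.1, tau, rng)
p = to_exponential_marginal(x)     # colored, non-Gaussian sample path
print(round(float(p.mean()), 2))   # near the target mean of 1.0
```

Note that the transform preserves the sign and decay of the autocorrelation but does not reproduce an exactly exponential correlation function for the transformed signal; matching a measured correlation function in addition to the marginal requires a more careful construction.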