The family of natural evolution strategies (NES) offers a principled approach to real-valued evolutionary optimization by following the natural gradient of expected fitness with respect to the parameters of its search distribution. While general in its formulation, existing research has focused only on multivariate Gaussian search distributions. We address this shortcoming by exhibiting problem classes for which other search distributions are more appropriate, and then derive the corresponding NES variants. First, we show how restricting NES to separable distributions reduces its per-update complexity from O(d³) to O(d), and apply it to problems of previously unattainable dimensionality, recovering lowest-energy structures of Lennard-Jones atom clusters and achieving state-of-the-art results on neuro-evolution benchmarks. Second, we develop a new, equivalent formulation based on invariances, which allows us to generalize NES to heavy-tailed distributions, even when their variance is undefined. We then investigate how this variant helps overcome deceptive local optima.
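To make the O(d³)-to-O(d) reduction concrete, the following is a minimal sketch of a separable NES update: each coordinate keeps only its own mean and standard deviation, so sampling and the natural-gradient step are both linear in the dimension d. The learning rates, population size, and rank-based utility weights below are common defaults, not necessarily the exact constants used in the paper, and the function names are illustrative.

```python
import numpy as np

def snes(f, mu, sigma, iterations=300, popsize=12, seed=0):
    """Sketch of separable NES: per-coordinate Gaussian search
    distribution, updated along the natural gradient of expected
    fitness. Cost per iteration is O(popsize * d)."""
    rng = np.random.default_rng(seed)
    d = len(mu)
    eta_mu = 1.0
    eta_sigma = (3 + np.log(d)) / (5 * np.sqrt(d))  # common default rate
    # Rank-based utility weights (fitness shaping), roughly zero-sum.
    ranks = np.arange(1, popsize + 1)
    u = np.maximum(0.0, np.log(popsize / 2 + 1) - np.log(ranks))
    u = u / u.sum() - 1.0 / popsize
    for _ in range(iterations):
        s = rng.standard_normal((popsize, d))    # standard-normal draws
        z = mu + sigma * s                       # candidate solutions
        order = np.argsort([f(zi) for zi in z])  # best (lowest f) first
        s = s[order]
        grad_mu = u @ s                          # gradient w.r.t. mean
        grad_sigma = u @ (s**2 - 1)              # gradient w.r.t. log-sigma
        mu = mu + eta_mu * sigma * grad_mu
        sigma = sigma * np.exp(0.5 * eta_sigma * grad_sigma)
    return mu, sigma

# Example: minimize a 20-dimensional sphere function.
best, _ = snes(lambda x: np.sum(x**2),
               mu=np.full(20, 3.0), sigma=np.full(20, 2.0))
```

Because the search distribution factorizes over coordinates, no d×d covariance matrix is stored, inverted, or factorized, which is what removes the cubic cost of the full multivariate-Gaussian update.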