Reservoir Computing is a new paradigm in artificial recurrent neural network training: a reservoir is generated randomly and only a readout layer is trained [1]. Its simplicity and ease of use, paired with its underlying computational power, make it an ideal choice for many application domains, such as time-series prediction, speech recognition, noise modeling, dynamic pattern classification, reinforcement learning and language modeling. However, it is necessary to adjust the parameters and the topology to create a "good" reservoir for a given application. This paper presents an original investigation of an evolutionary method for the simultaneous optimization of parameters, topology and reservoir weights in Echo State Networks. Optimizing reservoirs is challenging, and several evolutionary strategies have been proposed, generally separating the topology from the reservoir weights to reduce the search space [1]. Here we present a method that optimizes everything in concert. The results of this method applied to two different time series are shown and compared with previous works.
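To make the paradigm concrete, the following is a minimal sketch of an Echo State Network in NumPy: the input and reservoir matrices are drawn at random (the reservoir rescaled to spectral radius below 1, a common ESN heuristic), and only the linear readout is trained, here by ridge regression on a toy one-step-ahead sine prediction task. All dimensions, scalings and the task itself are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper)
n_in, n_res, washout = 1, 100, 50

# Random input and reservoir weights; the reservoir is rescaled so its
# spectral radius is below 1 (a common echo-state heuristic).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the fixed random reservoir with input sequence u and
    collect the state at each time step."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict a sine wave one step ahead.
t = np.arange(600)
u = np.sin(0.1 * t)
target = np.sin(0.1 * (t + 1))

# Discard an initial washout so the states forget the zero initial condition.
X = run_reservoir(u)[washout:]
y = target[washout:]

# Only the readout W_out is trained (ridge regression); the reservoir
# itself stays fixed after its random initialization.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

In a full application, the search this paper describes would act on exactly the quantities fixed by hand above: the global parameters (spectral radius, input scaling), the topology (which entries of W are nonzero) and the reservoir weights themselves.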