We explore the hyperparameter space of reservoir computers used to forecast the chaotic Lorenz '63 attractor with Bayesian optimization. We use a new measure of reservoir performance, designed to emphasize learning the global climate of the forecasted system rather than short-term prediction. We find that optimizing over this measure more quickly excludes reservoirs that fail to reproduce the climate. The results of the optimization are surprising: the optimized parameters often specify a reservoir network with very low connectivity. Inspired by this observation, we explore reservoir designs with even simpler structure and find well-performing reservoirs that have zero spectral radius and no recurrence. These simple reservoirs provide counterexamples to widely used heuristics in the field and may be useful for hardware implementations of reservoir computers.

Reservoir computers have seen wide use in forecasting physical systems, inferring unmeasured values within systems, and classification. The construction of a reservoir computer is often reduced to a handful of tunable parameters, and choosing the best parameters for the job at hand is a difficult task. We explore this parameter space on the forecasting task with Bayesian optimization, using a new measure of reservoir performance that emphasizes reproduction of the global climate and avoids known problems with the usual measure. We find that even reservoir computers with a very simple construction perform well at system forecasting. These simple constructions break common rules of reservoir design and may prove easier to implement in hardware than their more complex variants while performing just as well.
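To make the setup concrete, the sketch below shows a minimal echo state network forecasting the Lorenz '63 attractor: a random sparse recurrent reservoir driven by the system, with a ridge-regression readout run closed-loop to generate forecasts. This is an illustrative assumption of the standard construction, not the paper's implementation, and the hyperparameter values (reservoir size `N = 300`, spectral radius `0.9`, 2% connectivity, input scale `0.5`, ridge `1e-6`) are placeholder choices, not the optimized ones discussed above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Lorenz '63 vector field with the classic parameters.
def lorenz(u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = u
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

# Fixed-step RK4 integration to generate training data.
def integrate(u0, n, dt=0.02):
    traj = np.empty((n, 3))
    u = np.array(u0, float)
    for i in range(n):
        k1 = lorenz(u)
        k2 = lorenz(u + 0.5 * dt * k1)
        k3 = lorenz(u + 0.5 * dt * k2)
        k4 = lorenz(u + dt * k3)
        u = u + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i] = u
    return traj

# Illustrative hyperparameters (placeholders, not optimized values).
N, spectral_radius, input_scale, ridge = 300, 0.9, 0.5, 1e-6

# Sparse random recurrence matrix, rescaled to the target spectral radius.
A = rng.uniform(-1, 1, (N, N)) * (rng.random((N, N)) < 0.02)
A *= spectral_radius / np.max(np.abs(np.linalg.eigvals(A)))
Win = rng.uniform(-input_scale, input_scale, (N, 3))

# Drive the reservoir open-loop with the input sequence.
def run_reservoir(inputs):
    r = np.zeros(N)
    states = np.empty((len(inputs), N))
    for t, u in enumerate(inputs):
        r = np.tanh(A @ r + Win @ u)
        states[t] = r
    return states

data = integrate([1.0, 1.0, 1.0], 6000)          # discard first 1000 as transient
train_in, train_out = data[1000:4999], data[1001:5000]
states = run_reservoir(train_in)

# Ridge-regression readout mapping reservoir state to the next system state.
Wout = np.linalg.solve(states.T @ states + ridge * np.eye(N),
                       states.T @ train_out).T

# Closed-loop forecast: feed the readout's prediction back as input.
r, preds = states[-1], []
for _ in range(300):
    u = Wout @ r
    r = np.tanh(A @ r + Win @ u)
    preds.append(u)
preds = np.array(preds)
```

Because the tanh nonlinearity keeps reservoir states bounded, the closed-loop forecast cannot diverge to infinity; whether it also reproduces the attractor's climate is exactly what the performance measure described above is designed to test.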