Lateral predictive coding is a recurrent neural-network scheme that builds energy-efficient internal representations by exploiting statistical regularities in external sensory inputs. Here we investigate the trade-off between energy minimization and information robustness in a linear model of lateral predictive coding. We observe phase transitions in the synaptic weight matrix at several critical trade-off temperatures, notably a continuous transition that breaks permutation symmetry and builds cyclic dominance, and a discontinuous transition accompanied by the sudden emergence of tight competition between excitatory and inhibitory interactions. Self-organized hierarchical structures also form in the network at low temperatures. Our results reveal the structural complexity of lateral predictive coding networks.
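The energy-information trade-off described above can be illustrated with a toy numerical sketch. This is an assumption-laden illustration, not the paper's actual model: the linear circuit r = x - W r, the energy term E (mean squared response), the information proxy I (Gaussian log-determinant entropy of the responses), and the objective E - T I with trade-off temperature T are all illustrative choices made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 3, 2000

# Correlated Gaussian inputs: the statistical regularity the network exploits.
A = rng.normal(size=(N, N))
X = rng.normal(size=(M, N)) @ A.T

def responses(W, X):
    # Linear lateral circuit (assumed form): r = x - W r  =>  r = (I + W)^{-1} x
    return np.linalg.solve(np.eye(len(W)) + W, X.T).T

def loss(W, X, T):
    R = responses(W, X)
    E = np.mean(np.sum(R**2, axis=1))       # energy: mean squared response
    C = np.cov(R.T) + 1e-6 * np.eye(N)
    I = 0.5 * np.linalg.slogdet(C)[1]       # Gaussian entropy proxy for information
    return E - T * I                        # trade-off objective at temperature T

def grad(W, X, T, eps=1e-5):
    # Central-difference numerical gradient; adequate at this toy size.
    g = np.zeros_like(W)
    for i in range(N):
        for j in range(N):
            Wp = W.copy(); Wp[i, j] += eps
            Wm = W.copy(); Wm[i, j] -= eps
            g[i, j] = (loss(Wp, X, T) - loss(Wm, X, T)) / (2 * eps)
    np.fill_diagonal(g, 0.0)  # keep interactions strictly lateral (no self-coupling)
    return g

W = np.zeros((N, N))
T = 0.5
L0 = loss(W, X, T)
for _ in range(200):
    W -= 0.01 * grad(W, X, T)
print(f"loss: {L0:.3f} -> {loss(W, X, T):.3f}")
```

Shrinking the responses lowers E but also lowers I (penalized at rate T), so the learned W settles at a finite compromise; sweeping T in such a sketch is the analogue of scanning the trade-off temperatures at which the abstract reports structural transitions in W.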