The brain is a remarkable information engine. Its efficiency arises via specialized approaches to the task and a hierarchy: a very non-von-Neumann form. The paper suggests that this computational organization is an architecture of memories of procedures, and discusses the mathematical and physical basis for how this approach endows the brain with its efficiency across different tasks.

ABSTRACT | The brain carries out enormously diverse and complex information processing operations to deal with a constantly varying world on a power budget of about 12-20 W. We argue that this efficiency is achieved in part through the dedication of specialized circuit elements and architectures to specific computational tasks, in a hierarchy stretching from the scale of individual neurons to the scale of the entire brain, in sharp contrast to conventional von Neumann architectures. This paper suggests that the heterogeneous computational repertoires of the brain are architectural memories of efficient computational procedures that are learned via evolutionary selection.