We present a Hopfield-like autoassociative network that stores each memory as two activity patterns with complementary properties. The first encoding is dense and correlated with a subset of other dense encodings, such that each memory represents an example of a concept. The second encoding is sparse and exhibits no correlations among examples. The network stores each memory as a linear combination of its encodings, which allows sparse and dense encodings to be retrieved at high and low activity thresholds, respectively. At low threshold, as the number of stored examples increases, the retrieved activity shifts from representing densely encoded examples to densely encoded concepts, which are built by accumulating features common across examples. Meanwhile, at high threshold, the network can still distinctly retrieve many sparsely encoded examples, owing to the high capacity of sparse, decorrelated patterns. Thus, we demonstrate that a simple autoassociative network with a Hebbian learning rule can retrieve memories at two scales. It can also perform heteroassociation between them, such that one encoding of a memory can be used as a cue to retrieve the other. We obtain our results by deriving macroscopic mean-field equations for this network, from which we calculate capacity formulas for sparse examples, dense examples, and dense concepts. We also perform network simulations that verify our theoretical results and explicitly demonstrate the capabilities of our model.
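The storage and retrieval scheme described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's actual model: the network size, coding levels, example-concept correlation, the relative weight of the sparse encoding, and the use of a covariance-style Hebbian rule with threshold dynamics are all illustrative assumptions chosen to mirror the prose description (correlated dense examples of one concept, decorrelated sparse encodings, each memory stored as a linear combination of its two encodings, and a retrieval threshold that selects between them).

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                      # number of neurons (illustrative)
a_dense, a_sparse = 0.5, 0.02 # dense and sparse coding levels (assumed values)

# One concept with S correlated dense example encodings: each example copies
# a concept bit with probability 0.8, otherwise draws a fresh random bit.
concept = (rng.random(N) < a_dense).astype(float)
S = 5
dense = np.array([
    np.where(rng.random(N) < 0.8, concept,
             (rng.random(N) < a_dense).astype(float))
    for _ in range(S)
])
# Sparse encodings are independent across examples (no correlations).
sparse = (rng.random((S, N)) < a_sparse).astype(float)

# Hebbian storage: each memory enters the weights as a linear combination
# of its dense and sparse encodings; g is an assumed mixing coefficient.
g = 0.5
stored = dense + g * sparse
W = sum(np.outer(p - p.mean(), p - p.mean()) for p in stored) / N
np.fill_diagonal(W, 0.0)      # no self-connections

def retrieve(cue, theta, steps=20):
    """Iterate threshold dynamics: a high theta admits only the strongly
    driven units of a sparse pattern, while a low theta admits the many
    units of a dense pattern."""
    x = cue.copy()
    for _ in range(steps):
        x = (W @ x > theta).astype(float)
    return x
```

In this toy setting, cueing `retrieve` with a stored pattern at a low threshold tends toward dense activity shaped by the shared concept bits, while a high threshold restricts activity toward the sparse encoding; the paper's mean-field analysis makes this threshold dependence and the associated capacities precise.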