Research on working memory has followed two largely independent traditions: one concerned with memory for sequentially presented lists of discrete items, and the other with short-term maintenance of simultaneously presented arrays of objects with simple, continuously varying features. Here we present a formal model of working memory, the Interference Model, which explains benchmark findings from both traditions: the shape of the error distribution in continuous reproduction of visual features, and how it is affected by memory set size; the effects of serial position for sequentially presented items; the effect of output position; and the intrusion of nontargets as a function of their distance from the target in space and in time. We apply the model to two experiments combining features of popular paradigms from both traditions: lists of colors (Experiment 1) or of nonwords (Experiment 2) are presented sequentially and tested through selection of the target from a set of candidates ordered by their similarity. The core assumptions of the Interference Model are as follows: Contents are encoded into working memory through temporary bindings to contexts that serve as retrieval cues for accessing the contents. Bindings have limited precision on both the context and the content dimension. A subset of the memory set (usually one item and its context) is maintained in a focus of attention with high precision. Successive events in an episode are encoded with decreasing strength, generating a primacy gradient. With each encoded event, automatic updating of working memory reduces the strength of preceding memories, creating a recency gradient and output interference.
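The interplay of the last two assumptions can be illustrated with a highly simplified numerical sketch. This is not the authors' implementation of the Interference Model; the function name, the geometric form of both gradients, and all parameter values are illustrative assumptions chosen only to show how a primacy gradient at encoding and strength reduction at updating jointly shape the final memory strengths across serial positions.

```python
def encode_list(n_items, primacy_decay=0.8, interference=0.85):
    """Toy sketch of two gradients described in the model's assumptions.

    Each successive item is encoded with geometrically decreasing strength
    (primacy gradient); encoding each new item also weakens all earlier
    traces (automatic updating, yielding a recency gradient and output
    interference). Parameter values are illustrative, not model estimates.
    """
    strengths = []
    for i in range(n_items):
        # Automatic updating: every already-encoded trace loses strength.
        strengths = [s * interference for s in strengths]
        # Primacy gradient: later items are encoded more weakly.
        strengths.append(primacy_decay ** i)
    return strengths

# Final strength of item i after a list of n items is
# primacy_decay**i * interference**(n - 1 - i): the two gradients trade off.
strengths = encode_list(5)
```

With these illustrative parameters the primacy gradient dominates, so final strengths fall off across serial positions; making `interference` stronger relative to `primacy_decay` instead favors the most recent items.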