Topic models are often used to identify human-interpretable topics that help make sense of large document collections. We use knowledge distillation to combine the best attributes of probabilistic topic models and pretrained transformers. Our modular method can be straightforwardly applied to any neural topic model to improve topic quality, which we demonstrate using two models with disparate architectures, obtaining state-of-the-art topic coherence. We show that our adaptable framework improves performance not only in the aggregate over all estimated topics, as is commonly reported, but also in head-to-head comparisons of aligned topics.

* Equal contribution.

[Figure: a base neural topic model maps a bag-of-words (BoW) representation of a document d (e.g., "Marcel Duchamp was a painter, sculptor, chess player, and writer whose work is associated with Cubism, Dada, and conceptual art.") to a document-topic distribution θ_d; example topic words shown include art, chess, gingerbread, modernism, painter, picasso. Labels in the original figure also include BAT and B.]

7 qwone.com/~jason/20Newsgroups
8 s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip
9 ai.stanford.edu/~amaas/data/sentiment
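To make the idea of distilling a pretrained transformer into a neural topic model concrete, here is a minimal sketch of one common formulation: the student's reconstruction objective over the vocabulary is mixed with a soft cross-entropy against a teacher's word distribution. The function name, the temperature/weighting scheme, and all tensor shapes are illustrative assumptions, not the authors' exact objective.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, bow, T=2.0, alpha=0.5):
    """Mix a BoW reconstruction term with a knowledge-distillation term.

    student_logits: (batch, vocab) logits from the topic model's decoder
    teacher_logits: (batch, vocab) logits from a pretrained teacher
    bow:            (batch, vocab) observed word counts
    NOTE: this is an illustrative sketch, not the paper's exact loss.
    """
    # Soft targets: the teacher's word distribution, smoothed by temperature T
    teacher_probs = F.softmax(teacher_logits / T, dim=-1)
    # Student log-probabilities over the vocabulary
    log_probs = F.log_softmax(student_logits, dim=-1)
    # Standard reconstruction term: negative log-likelihood of observed counts
    recon = -(bow * log_probs).sum(-1).mean()
    # Distillation term: cross-entropy against the teacher's soft targets
    distill = -(teacher_probs * log_probs).sum(-1).mean()
    return alpha * recon + (1 - alpha) * distill
```

Because the distillation term only touches the loss, a sketch like this slots into any neural topic model whose decoder emits vocabulary logits, which is the sense in which such a method is modular.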