Over the past few decades, text-analysis methods have gradually entered the toolbox used to reliably measure psychological constructs. Yet many existing computational methods in psychological text analysis remain atheoretical and lack the interpretability that the social sciences are accustomed to and require. Here, we introduce a novel method for theory-driven text analysis that bridges the power of contextual language models and common psychometric scales. The new technique, which we call Contextualized Construct Representation (CCR), retains a high level of interpretability and top-down flexibility while leveraging state-of-the-art language models developed in natural language processing (NLP). CCR is a flexible technique that can adapt to the continually evolving set of tools for language modeling. We discuss how the proposed technique quantifies psychological information in textual data and demonstrate in two studies (N = 2,996) that CCR outperforms other top-down methods (i.e., word counting and word-embedding representations) in predicting an array of psychological outcomes common in social and personality psychology, including moral values, need for cognition, political ideology, strength of norms, and cultural orientation. We provide an accompanying R package and Python library, and we develop an interface so that researchers can conveniently use CCR in their research.
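The core idea of bridging psychometric scales and language models can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `embed` function below is a toy deterministic stand-in for a contextual sentence encoder (in practice a transformer-based model would be used), and the scale items shown are hypothetical examples in the style of a need-for-cognition questionnaire.

```python
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy deterministic stand-in for a contextual sentence encoder.
    In practice, replace with a transformer-based sentence embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.normal(size=dim)

def ccr_score(user_text: str, scale_items: list[str]) -> float:
    """Cosine similarity between a text's embedding and the mean embedding
    of a psychometric scale's items (a vector representing the construct)."""
    construct = np.mean([embed(item) for item in scale_items], axis=0)
    v = embed(user_text)
    return float(np.dot(v, construct) /
                 (np.linalg.norm(v) * np.linalg.norm(construct)))

# Hypothetical scale items and participant text for illustration only.
items = ["I enjoy tasks that require a lot of thinking.",
         "I prefer complex problems to simple ones."]
score = ccr_score("I love working through hard puzzles.", items)
```

Higher scores indicate that the text is semantically closer to the construct described by the scale items; the cosine similarity is bounded in [-1, 1].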