When reading a sentence, the meanings of individual words are combined to create more complex meaning. In this study, we sought to uncover brain regions that represent meaning at the sentence level, as opposed to only the meanings of a sentence's constituent words. Using fMRI, we recorded participants' neural activity while they read sentences. We constructed sentence topic-level representations using the final layer of a convolutional neural network (CNN) trained to classify Wikipedia sentences into broad semantic categories. This model was contrasted with word-level sentence representations constructed by averaging the embeddings of the words constituting each sentence. Using representational similarity analysis, we found that the medial prefrontal cortex, lateral anterior temporal lobe, precuneus, and angular gyrus represent sentence topic-level meaning more strongly than word-level meaning, uncovering the important role of these semantic-system regions in the representation of integrated meaning. In turn, these results validate the capacity of CNNs to capture human sentence-level representations.
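The word-level baseline and the representational similarity analysis described above can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the toy embeddings, sentence set, and function names are assumptions for demonstration, and the neural patterns here are stand-ins for the measured fMRI activity.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def average_embedding(sentence, embeddings):
    """Word-level sentence representation: the mean of its word vectors."""
    vecs = [embeddings[w] for w in sentence.split() if w in embeddings]
    return np.mean(vecs, axis=0)

def rdm(vectors):
    """Condensed representational dissimilarity matrix (1 - Pearson r
    between every pair of sentence representations)."""
    return pdist(np.asarray(vectors), metric="correlation")

def rsa(model_vectors, neural_patterns):
    """Spearman correlation between the model RDM and the neural RDM."""
    rho, _ = spearmanr(rdm(model_vectors), rdm(neural_patterns))
    return rho

# Toy 5-dimensional word embeddings (random, for illustration only).
rng = np.random.default_rng(0)
emb = {w: rng.standard_normal(5) for w in "the cat sat dog ran fast".split()}
sentences = ["the cat sat", "the dog ran", "dog ran fast", "cat sat fast"]

model = [average_embedding(s, emb) for s in sentences]
# In the study, `neural` would be one activity pattern per sentence from a
# brain region; here we use placeholder patterns of matching shape.
neural = np.asarray(model) + rng.standard_normal((4, 5)) * 0.1
score = rsa(model, neural)
```

A higher `score` in one region than another would indicate that the region's pairwise sentence geometry more closely matches the model's; comparing scores for the CNN-based and averaging-based models is what distinguishes topic-level from word-level representation.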