Policy decisions have vast consequences, yet there is little empirical research on how best to communicate the underlying evidence to decision makers. Groups in diverse fields (e.g., education, medicine, crime) use brief graphical displays to list policy options, expected outcomes, and evidence quality, aiming to make such evidence easy to assess. However, comprehension of these representations is rarely studied. We surveyed experts and non-experts on what information they want and tested their objective comprehension of commonly used graphics. A total of 252 UK residents recruited via Prolific and 452 UK What Works Centre users interpreted the meaning of graphics shown without labels. Comprehension was low (often below 50 per cent). The best-performing graphics used unambiguous, relevant shapes, colour cues, and indications of quantity. Participants also reported what types of evidence they wanted and in what detail (e.g., subgroups; different outcomes). Intervention effectiveness and evidence quality were universally paramount, and policy makers also wanted to know about financial costs and other negative consequences. Comprehension and preferences were remarkably consistent across the two samples. Groups communicating evidence about policy options can use these results to design summaries, toolkits, and reports for expert and non-expert audiences.