Characterizing what makes a packet reordering metric meaningful is a problem that has attracted significant interest, but it still lacks a universally accepted solution. We contribute to this discussion by investigating some theoretical concepts that make the following simple intuitions precise:

- A metric that is inconsistent, i.e., gives different values on two similar TCP traces, should not be regarded as useful.
- We formalize the notion of two traces being "identical modulo unimportant details" using similarity relations.
- If "real-life" traces differ from random sequences by always satisfying certain reorder invariants, then we should only use traces satisfying these invariants when investigating the consistency of a reordering metric.

We illustrate these concepts in the context of Restored, an approach to semantic compression of TCP traces [10]. In particular, we discuss the consistency of two metrics defined by Jayasumana et al. [1,12] with respect to the similarity notions defined in [8,9,10].
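To make the consistency intuition concrete, the following Python sketch uses toy definitions throughout: a naive "late packets" metric, and a similarity relation that treats retransmitted sequence numbers as an unimportant detail. None of these are the metrics of Jayasumana et al. or the similarity relations of [8,9,10]; they only illustrate how a metric can fail, or satisfy, consistency with respect to a given relation.

```python
def late_packets(trace):
    """Toy reordering metric: count arrivals whose sequence number is
    below the running maximum (i.e., packets arriving 'late')."""
    seen_max = -1
    late = 0
    for seq in trace:
        if seq < seen_max:
            late += 1
        seen_max = max(seen_max, seq)
    return late

def drop_retransmissions(trace):
    """Normal form for the toy similarity relation: keep only the first
    arrival of each sequence number."""
    seen, out = set(), []
    for seq in trace:
        if seq not in seen:
            seen.add(seq)
            out.append(seq)
    return out

def similar(a, b):
    """Toy similarity: traces are 'identical modulo unimportant details'
    if they agree after retransmissions are discarded."""
    return drop_retransmissions(a) == drop_retransmissions(b)

def consistent_on(metric, a, b):
    """Consistency on a pair: similar traces must get equal metric values."""
    return (not similar(a, b)) or metric(a) == metric(b)

a = [1, 2, 3, 2, 4]   # seq 2 retransmitted after seq 3 arrived
b = [1, 2, 3, 4]      # the same trace with the retransmission removed

# The raw metric counts the retransmission as a late packet, so it
# distinguishes two similar traces: inconsistent for this relation.
print(consistent_on(late_packets, a, b))   # False

# Evaluating the metric on the normal form restores consistency.
print(consistent_on(lambda t: late_packets(drop_retransmissions(t)), a, b))   # True
```

The pattern generalizes: a similarity relation induces a normal form, and any metric factoring through that normal form is consistent by construction, which is the shape of argument the abstract alludes to.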