The argument that non-verbal modalities play a major role in accounts of human communication is not new. Insights from early pragmatics into the significance of paralinguistic behaviours in interaction certainly give us a taste of the non-verbal dominance view of communication. Birdwhistell (1970, p. 158) reports that 'probably no more than 30 to 35 percent of the social meaning of a conversation or an interaction is carried by the words', while Mehrabian's 7%-38%-55% rule (1971) stipulates that 55 per cent of the information we convey in a given exchange is communicated via our body language, 38 per cent via our tone of voice (how we say what we say) and 7 per cent via what is said (i.e. the words themselves). The importance of acknowledging the role of paralinguistic phenomena in face-to-face interaction is articulated by Abercrombie (1968, p. 65): 'The conversational use of spoken language cannot be properly understood unless paralinguistic elements are taken into account.' Stevick (1982, p. 163), meanwhile, describes non-verbal communication as providing 'the surface on which the words are written and against which they must be interpreted'. In this paper, we view multimodality as close to Claudel's notion of L'œil écoute (1946; Madella, 2021), encompassing both visible and audible modes of communication and interpretation, which are to 'converge to contribute to a final meaning or message of which they are an intrinsic part' (Riley, 1979, p. 143). 'Multimodal' inputs, what Adam Kendon refers to as 'poly-modalic' (2014a, p. 67), invite the eyes to be as alert as the ears. Linguistic communication is, beyond any reasonable doubt, multimodal. Pragmatic accounts of linguistic communication must be multimodal also.