Suppose that we have a regression problem with response variable Y ∈ R^d and predictor X ∈ R^d, for d ≥ 1. In permuted or unlinked regression we have access to separate, unordered data on X and Y, as opposed to data on (X, Y)-pairs in usual regression. In the literature so far, only the case d = 1 has received attention; see, e.g., the recent papers by Rigollet and Weed [Information & Inference, 8] and Balabdaoui et al. [J. Mach. Learn. Res., 22(172), 1-60]. In this paper, we consider the general multivariate setting with d ≥ 1. We show that cyclical monotonicity of the regression function is sufficient for identification and estimation in the permuted/unlinked regression model. We study permutation recovery in the permuted regression setting and develop a computationally efficient, easy-to-use denoising algorithm based on the Kiefer-Wolfowitz [Ann. Math. Statist., 27, 887-906] nonparametric maximum likelihood estimator and techniques from the theory of optimal transport. We provide explicit upper bounds on the associated mean squared denoising error for Gaussian noise. As in previous work on the case d = 1, the permuted/unlinked setting exhibits slow (logarithmic) rates of convergence rooted in the underlying deconvolution problem. Numerical studies corroborate our theoretical analysis and show that the proposed approach performs at least on par with the methods of the aforementioned prior work in the case d = 1, while achieving substantial reductions in computational complexity.
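
The permutation-recovery step via optimal transport can be illustrated with a minimal sketch. This is not the paper's estimator: it assumes the regression map f is known (here a linear map with a symmetric positive-definite matrix, hence the gradient of a convex quadratic and cyclically monotone), whereas the actual method replaces f with a fit built on the Kiefer-Wolfowitz NPMLE. The pairing between predictors and shuffled responses is recovered by solving a discrete optimal transport (linear assignment) problem with squared-Euclidean cost.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n, d = 200, 2
X = rng.normal(size=(n, d))

# f(x) = A x with A symmetric positive definite: the gradient of the convex
# function x -> x' A x / 2, hence a cyclically monotone map
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
Y = X @ A + 0.05 * rng.normal(size=(n, d))

# Unlinked sample: the pairing between X and Y is lost
perm = rng.permutation(n)
Y_shuffled = Y[perm]

# Recover the pairing by discrete optimal transport: solve the linear
# assignment problem minimizing sum_i ||f(X_i) - Y_{sigma(i)}||^2
cost = (((X @ A)[:, None, :] - Y_shuffled[None, :, :]) ** 2).sum(axis=2)
row, col = linear_sum_assignment(cost)
recovered = np.empty(n, dtype=int)
recovered[row] = col  # recovered[i] = index in Y_shuffled matched to X_i

# Fraction of pairs matched correctly (X_i's true partner is Y_shuffled[j]
# with perm[j] = i)
accuracy = np.mean(perm[recovered] == np.arange(n))
print(f"permutation recovery accuracy: {accuracy:.2f}")
```

With low noise relative to the spread of f(X), the assignment recovers nearly all pairs; as the noise level grows, recovery degrades, which is where the denoising analysis of the paper becomes relevant.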