Multiwavelength radiation from relativistic particles accelerated at shocks in novae and other astrophysical sources carries a wealth of information about the outflow properties and the microphysical processes at work near the shocks. The detection of GeV gamma-rays from novae by Fermi/LAT demonstrates that the shocks in these systems can accelerate particles to energies of at least ∼10 GeV. The low-energy extension of the same non-thermal particle distribution inevitably gives rise to emission extending into the X-ray band. Above 10 keV this radiation can escape the system without significant absorption or attenuation, and can potentially be detected by NuSTAR. We present theoretical models for the hard X-ray and gamma-ray emission from radiative shocks in both leptonic and hadronic scenarios, accounting for the rapid evolution of the downstream properties due to the fast cooling of the thermal plasma. Because of strong Coulomb cooling of the mildly relativistic electrons nominally responsible for producing hard X-ray emission, only a fraction ∼10⁻⁴ − 10⁻³ of the gamma-ray luminosity is radiated in the NuSTAR band; nevertheless, this emission could be detectable simultaneously with the LAT emission in bright gamma-ray novae with a ∼50 ks exposure. The spectral slope in hard X-rays is α ≈ 0 for typical nova parameters, thus serving as a testable prediction of the model. Our work demonstrates how combined hard X-ray and gamma-ray observations can be used to constrain the properties of the nova outflow (velocity, density, and mass outflow rate) and of particle acceleration at the shock. A very low X-ray to gamma-ray luminosity ratio (L_X/L_γ ≲ 5 × 10⁻⁴) would disfavor leptonic models for the gamma-ray emission. Our model can also be applied to other astrophysical environments with radiative shocks, including Type IIn supernovae and colliding winds in massive star binaries.