We show that the method of partial covariance is a very efficient way to introduce constraints (such as the centrality selection) in data analysis of ultra-relativistic nuclear collisions. The technique eliminates spurious event-by-event fluctuations of physical quantities caused by fluctuations of the control variables. Moreover, in the commonly used superposition approach to particle production, the method can be used to impose constraints on the initial sources rather than on the finally produced particles, thus separating out the trivial fluctuations from statistical hadronization or emission from sources and focusing strictly on the initial-state physics. As an illustration, we use simulated data from hydrodynamics started from wounded-quark event-by-event initial conditions, followed by statistical hadronization, to demonstrate the practicality of the approach in analyzing forward-backward multiplicity fluctuations. We mention generalizations to the case of several constraints and to other observables, such as transverse-momentum or eccentricity correlations. This talk is based on [1], where more details can be found.

The technique of partial covariance is widely used in other areas of science in situations where one can distinguish between the physical variables and the control (spurious, nuisance) variables in multivariate statistical samples (see, e.g., [2,3]). Such a separation occurs in typical setups in ultra-relativistic nuclear collisions, where the response of certain detectors is used to determine centrality (the quantile of a given measured quantity), which plays the role of a control variable, whereas the other quantities correspond to physical variables. The problem of eliminating fluctuations of centrality, which spuriously correlate with physical quantities, has a long history with numerous methods,