On present-day magnetic-confinement fusion experiments, the performance of multi-channel bolometer diagnostics has typically evolved over time through experience with earlier versions of the diagnostic and with the experimental results obtained. For future large-scale fusion experiments and reactors, it is necessary to be able to predict the performance as a function of design decisions and constraints. A methodology has been developed to predict the accuracy with which the volume-integrated total radiated power can be estimated from the measurements of a resistive bolometer diagnostic, considering, in particular, its line-of-sight geometry, the étendues of the individual lines of sight, the bolometer-sensor characteristics, and the noise level expected from its electronics and signal chain. The methodology relies on a number of assumptions in order to arrive at analytical expressions, but it does not restrict the final implementation of the data processing of the diagnostic measurements. It allows us to predict the performance in terms of accuracy, total radiated power level, and frequency or time resolution, and to optimize the bolometer-sensor characteristics for a given set of performance requirements. This is illustrated for the bolometer diagnostic being designed for the ITER experiment. The reasonableness, consequences, and limitations of the assumptions are discussed in detail.
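
The sketch below is a minimal, hypothetical illustration of the kind of accuracy prediction described above. It assumes that the total radiated power is reconstructed as a linear, étendue-weighted sum of the channel signals and that each channel carries independent Gaussian noise; the channel count, étendues, geometric weights, and noise level are illustrative placeholders, not values from the ITER design or the methodology of this paper.

```python
import numpy as np

# Hypothetical example: propagate per-channel measurement noise into the
# uncertainty of the volume-integrated total radiated power, assuming a
# linear reconstruction from line-of-sight bolometer signals.

rng = np.random.default_rng(0)

n_ch = 48                                   # number of bolometer channels (assumed)
etendue = rng.uniform(1e-9, 5e-9, n_ch)     # per-channel etendue [m^2 sr] (assumed)
weights = rng.uniform(0.5, 2.0, n_ch)       # geometric volume weights (assumed)

# Assumed "true" incident power per channel [W] and rms noise-equivalent power [W].
p_incident = rng.uniform(1e-6, 5e-6, n_ch)
sigma_noise = 2e-8

def total_radiated_power(p_meas, etendue, weights):
    """Linear estimator: weighted sum of per-channel brightness (power / etendue)."""
    return np.sum(weights * p_meas / etendue)

# Monte Carlo estimate of the resulting relative accuracy of the total power.
n_samples = 10_000
samples = np.array([
    total_radiated_power(p_incident + rng.normal(0.0, sigma_noise, n_ch),
                         etendue, weights)
    for _ in range(n_samples)
])

p_true = total_radiated_power(p_incident, etendue, weights)
rel_std_mc = samples.std() / p_true

# Analytical error propagation for the same linear estimator
# (independent Gaussian noise on each channel).
rel_std_analytic = np.sqrt(np.sum((weights / etendue) ** 2)) * sigma_noise / p_true

print(f"P_rad (true)             : {p_true:.3e} W")
print(f"relative error (MC)      : {rel_std_mc:.3%}")
print(f"relative error (analytic): {rel_std_analytic:.3%}")
```

For a purely linear estimator of this kind the analytical expression and the Monte Carlo result agree, which is the sense in which closed-form accuracy predictions can be derived and then checked against the eventual data-processing implementation.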