The phase coherence limitations of L-band digital correlation radiometry are investigated for receiver architectures that use low A/D converter resolution (1-3 bits). Statistical models and measurements of a 1.4 GHz digital radiometer system show that coarse quantization can cause excess fringe washing losses, which degrade the spatial resolution of synthetic thinned array radiometry (STAR) implementations. For single-bit STAR, excess fringe washing is discernible immediately off boresight and, farther from the image center, can cause as much as 2 dB of loss in visibility information. To accommodate low-bit correlators in remote sensing STAR, a novel band division correlation (BDC) processor is proposed. BDC improves the time coherence of each correlated brightness signal while maintaining the system bandwidth and noise-equivalent sensitivity of a conventional STAR radiometer. To evaluate this band-slicing technique, analytical and numerical solutions are presented for the point spread function of a 27 m L-band STAR sensor. The results show that with four subband channels, BDC improves swath-edge resolution from 17.0 to 10.2 km and reduces correlation loss from 2.5 to 0.2 dB.
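The intuition behind band slicing can be shown with a minimal numerical sketch. For an idealized rectangular passband of bandwidth B, the fringe-washing envelope of a correlated pair is |sinc(B*tau)|, where tau is the differential geometric delay across a baseline; dividing the band into N subbands shrinks each channel's decorrelation to |sinc(B*tau/N)|, while recombining the N channels preserves the total processed bandwidth (and hence, per the abstract, the noise-equivalent sensitivity). In the sketch below, the 20 MHz bandwidth, the 15 degree off-boresight angle, and the amplitude (20 log10) loss convention are all illustrative assumptions, not the paper's system parameters; only the 27 m aperture comes from the abstract.

```python
import numpy as np

# Illustrative sketch of why band division correlation (BDC) reduces
# fringe washing. Assumes an idealized rectangular passband, so the
# fringe-washing envelope is |sinc(B * tau)|, with np.sinc(x) defined
# as sin(pi*x)/(pi*x). All values except the 27 m baseline are
# assumptions for illustration, not the paper's parameters.

C = 3.0e8        # speed of light, m/s
B = 20.0e6       # full predetection bandwidth, Hz (assumed)
D = 27.0         # longest baseline, m (27 m sensor from the abstract)
N_SUB = 4        # number of BDC subband channels

def fringe_washing_loss_db(bandwidth_hz, delay_s):
    """Decorrelation loss, in dB on the visibility amplitude."""
    envelope = np.abs(np.sinc(bandwidth_hz * delay_s))
    return -20.0 * np.log10(envelope)

# Differential geometric delay across the longest baseline for a
# source at off-boresight angle theta: tau = (D / c) * sin(theta).
theta = np.deg2rad(15.0)                 # swath-edge direction (assumed)
tau = (D / C) * np.sin(theta)

print(f"delay across 27 m baseline: {tau * 1e9:.1f} ns")
print(f"full-band loss:   {fringe_washing_loss_db(B, tau):.2f} dB")
print(f"per-subband loss: {fringe_washing_loss_db(B / N_SUB, tau):.2f} dB")
```

With these assumed values the full-band envelope gives roughly 3.4 dB of loss while each B/4 subband suffers about 0.2 dB, which mirrors (but does not reproduce) the 2.5 to 0.2 dB improvement reported in the abstract. The sketch compares decorrelation envelopes only; how the subband channels are aligned and recombined is part of the proposed BDC processor and is not modeled here.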