Phase-sensitive optical time-domain reflectometry (Φ-OTDR) achieves high-sensitivity vibration measurement by measuring the vibration-induced phase change of Rayleigh scattering from the sensing fiber. The minimum detectable vibration of a Φ-OTDR is limited by the noise of the phase measurement. In this paper, the polarization dependence of the phase-measurement noise in a Φ-OTDR is investigated theoretically and experimentally. The correspondence between the intensity of the Rayleigh scattering and the polarization state of the probe pulse in a Φ-OTDR is analyzed, taking into account the inner interference of coherent Rayleigh light scattered within the pulse duration and the polarization dependence of the propagation constant. Experiments are performed, and the results confirm the predictions of the theoretical analysis. This study provides insight into the polarization dependence of vibration measurement based on a Φ-OTDR, and suggests a feasible method to eliminate the effect of polarization fading and optimize the minimum detectable vibration of a Φ-OTDR.
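The mechanism described above — coherent (inner) interference of light scattered by many scatterers within the pulse duration, combined with polarization evolution along the fiber — can be illustrated with a minimal numerical sketch. This is not the paper's model: it is a simplified Jones-calculus toy in which each scatterer contributes a random complex amplitude and a random polarization rotation (a crude stand-in for the polarization-dependent propagation constant); all names and parameters here are illustrative assumptions. The detected backscatter intensity then depends on the probe polarization, which is the essence of polarization fading.

```python
import numpy as np

rng = np.random.default_rng(42)

def backscatter_intensity(theta, a, R):
    """Detected intensity for a linearly polarized probe at angle theta.

    The backscattered field is the coherent sum over scatterers inside one
    resolution cell (i.e. within the pulse duration): scatterer k contributes
    a random complex amplitude a_k, and a random rotation R_k models the
    polarization change due to the polarization-dependent propagation constant.
    """
    e_in = np.array([np.cos(theta), np.sin(theta)])  # Jones vector of the probe
    field = np.einsum('k,kij,j->i', a, R, e_in)      # coherent (inner) interference
    return float(np.vdot(field, field).real)         # intensity |E|^2

# One resolution cell: scatterers lying within the pulse duration.
n = 500
a = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(n)
phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
R = np.moveaxis(np.array([[np.cos(phi), -np.sin(phi)],
                          [np.sin(phi),  np.cos(phi)]]), -1, 0)  # shape (n, 2, 2)

# Sweep the probe polarization angle: the intensity varies, so a badly chosen
# input state can fall near a fading point with low signal (and high phase noise).
thetas = np.linspace(0.0, np.pi, 181)
I = np.array([backscatter_intensity(t, a, R) for t in thetas])
print(f"fading depth I_min/I_max = {I.min() / I.max():.3f}")
```

In this toy model, the ratio `I_min/I_max` quantifies how deeply the backscatter from a fixed cell can fade as the probe polarization changes; in a real Φ-OTDR, low-intensity (faded) cells yield noisier phase estimates, which motivates polarization-aware optimization of the minimum detectable vibration.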