We show experimentally that the output light of a single-mode semiconductor laser is changed into chaotic light after passage through a narrow spectral filter. Simple linear filtering allows us to distinguish between different statistical models of laser phase noise. The experimental results agree with the phase-diffusion model.

PACS numbers: 42.50.Ar, 42.50.Lc, 42.55.Px, 42.60.Mi

It has been pointed out by Armstrong [1] as early as 1966 that laser phase noise can be converted into intensity noise when the laser light is passed through a dispersive structure such as an optical interferometer. This notion has been appreciated in fiber optics [2], laser physics [3,4], and laser spectroscopy [5-7]. As an example, in fiber-optic interferometric sensors [8] and signal processors [9], the optical performance is mainly limited by an excessive amount of intensity noise, which results from interferometric conversion of laser phase noise. Since the phase-induced intensity noise is directly proportional to the input power, it can severely limit the dynamic range of these systems. Conversion of phase noise into intensity noise may also be used on purpose: phase-noise characteristics of a laser can be studied [3] and manipulated [4] by means of dispersive elements. In laser spectroscopy an atom instead of an interferometer acts as the resonant system; the phase fluctuations of the laser source can then lead to fluctuations in the resonance fluorescence of the laser-excited atoms [5,7,10].

One of the most interesting predictions of Armstrong [1] has not yet been verified, to our knowledge. Here we refer to the prediction that laser light with a Schawlow-Townes or quantum-limited linewidth Δω_l will be converted into chaotic light when passed through a spectral filter with bandwidth Δω_f if Δω_f ≪ Δω_l.
Alternatively phrased, the prediction is that light without intensity fluctuations, corresponding to a source with Poissonian photon statistics (e.g., a laser), is converted into light with an exponential intensity distribution, corresponding to Bose-Einstein photon statistics. Semiconductor lasers are ideally suited for experimental verification of this effect: the high gain and small dimensions of these lasers give them a relatively large Schawlow-Townes linewidth (typically 10-100 MHz at 1 mW output power). We report here on the effect of spectral filtering using the amplitude-stabilized field of a semiconductor laser as input. Furthermore, we extend the theoretical description of spectral filtering beyond that given by Armstrong [1]. This allowed us to distinguish experimentally between different statistical models of laser phase noise. For completeness we note that such a distinction has been discussed theoretically for the case of resonance fluorescence of two-level atoms [6]. Note also that effects of spectral filtering on a randomly amplitude-fluctuating field (instead of an amplitude-stabilized field) have been observed by Klimeck, Elliot, and Hamilton [11].

The field of a laser can be modeled as a coherent state which...
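The prediction above can be illustrated numerically under the phase-diffusion model: an amplitude-stabilized field exp[iφ(t)] with a Wiener-process phase is passed through a spectral filter with Δω_f ≪ Δω_l, and the filtered intensity acquires thermal (Bose-Einstein) statistics, i.e. g²(0) → 2. The sketch below is illustrative only; the linewidth and filter values are hypothetical choices, not the parameters of the experiment, and the single-pole Lorentzian filter stands in for the narrow spectral filter used here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Phase-diffusion model: constant-amplitude field with Wiener-process phase,
# <[phi(t+tau) - phi(t)]^2> = 2*pi*dnu_l*|tau|, with dnu_l the laser linewidth.
dt = 1e-10        # time step [s] (illustrative)
n = 2_000_000     # number of samples
dnu_l = 50e6      # laser linewidth, 50 MHz (typical of diode lasers, see text)

dphi = rng.normal(0.0, np.sqrt(2 * np.pi * dnu_l * dt), n)
field = np.exp(1j * np.cumsum(dphi))   # amplitude-stabilized input: |E| = 1

# Narrow spectral filter, dnu_f << dnu_l, modeled as a single-pole Lorentzian
# (first-order IIR low-pass acting on the complex field).
dnu_f = 1e6                            # filter bandwidth, 1 MHz (illustrative)
a = np.exp(-2 * np.pi * dnu_f * dt)
out = np.empty(n, dtype=complex)
acc = 0.0
for k in range(n):
    acc = a * acc + (1 - a) * field[k]
    out[k] = acc

I = np.abs(out[n // 4:]) ** 2          # discard the filter transient
I /= I.mean()

# Chaotic (thermal) light has an exponential intensity distribution,
# so g2(0) = <I^2>/<I>^2 approaches 2 (it is 1 for the unfiltered input).
g2 = np.mean(I ** 2)
print(f"g2(0) = {g2:.2f}")             # close to 2 for dnu_f << dnu_l
```

The input field has strictly constant intensity; only after the narrow filter superposes many mutually dephased time slices does the field become approximately Gaussian, producing the exponential intensity distribution predicted by Armstrong [1].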