We study video streaming over a slow fading wireless channel. In a streaming application, video packets must be decoded and displayed in the order in which they are transmitted, while the transmission is ongoing. This imposes a per-packet delay constraint, and the resulting channel can be modeled as a physically degraded fading broadcast channel with as many virtual users as there are packets. In this paper we study two important quality of experience (QoE) metrics, namely throughput and inter-decoding delay. We introduce several transmission schemes and compare their throughput and maximum inter-decoding delay performance. We also introduce a genie-aided scheme, which provides theoretical bounds on the achievable performance. We observe that adapting the transmission rate at the packet level, i.e., periodically dropping a subset of the packets, leads to a good tradeoff between throughput and maximum inter-decoding delay. We also show that an approach based on initial buffering achieves an asymptotically vanishing packet loss rate at the expense of a relatively large initial delay, and for this scheme we derive a condition on the buffering time that maximizes throughput.