ABSTRACT

In this paper, we consider a cumulative
I. Introduction

Opportunistic scheduling has been studied extensively over the last decade; it maximizes the sum throughput of a wireless network by selecting, in each time-slot, the user with the largest channel gain [1]. As a result, a user with a higher average signal-to-noise ratio (SNR) is scheduled more frequently, which leads to unfairness [2]. Cumulative distribution function (CDF)-based opportunistic scheduling [3] was proposed to resolve this fairness problem while still achieving high throughput by utilizing a set of weight parameters, and it has been extended to various networks in recent work [1]. Along with fairness, another important issue inherent in opportunistic scheduling is starvation: a user may wait a long time until it experiences a peak in its channel gain, in contrast to non-opportunistic policies such as round-robin scheduling. Such starvation can become severe in wireless networks with time-correlated channels. Note that the starvation period (i.e., the length of the time interval between two successive scheduling instants of a user) determines the head-of-line packet delay, so understanding the starvation period helps in controlling delay performance.

In this paper, we present a closed-form expression for the average starvation period of CDF-based scheduling over Markov time-varying channels. Through numerical studies, we investigate the starvation period for various system parameters.
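As an illustrative sketch only (not the analysis of this paper), CDF-based scheduling is commonly described as selecting, in each slot, the user k maximizing F_k(X_k)^(1/w_k), where F_k is user k's channel-gain CDF and w_k > 0 its weight; this makes user k's selection probability w_k / Σ_j w_j regardless of the SNR distributions. The following minimal simulation, with assumed function names, weights, and slot counts, measures the resulting empirical starvation periods under i.i.d. (memoryless) channels; the paper's Markov time-varying channels would require a correlated gain process instead.

```python
import random

def simulate(num_users=3, weights=(1.0, 1.0, 1.0), slots=200_000, seed=1):
    """Hedged sketch of CDF-based scheduling: in each slot, pick
    argmax_k F_k(X_k)^(1/w_k). For a continuous channel gain X_k,
    U_k = F_k(X_k) is Uniform(0,1) (probability integral transform),
    so under i.i.d. channels we may draw the CDF values directly.
    Returns each user's scheduling share and its mean starvation
    period (gap, in slots, between successive scheduling instants)."""
    rng = random.Random(seed)
    last = [0] * num_users                   # last slot each user was scheduled
    gaps = [[] for _ in range(num_users)]    # observed starvation periods
    for t in range(1, slots + 1):
        scores = [rng.random() ** (1.0 / w) for w in weights]
        k = max(range(num_users), key=lambda i: scores[i])
        gaps[k].append(t - last[k])
        last[k] = t
    share = [len(g) / slots for g in gaps]
    mean_gap = [sum(g) / len(g) for g in gaps]
    return share, mean_gap
```

With equal weights and i.i.d. channels, each of the three users is scheduled about one third of the time and the starvation period is geometric with mean about 3 slots; time-correlated (Markov) channels lengthen the tail of this distribution, which is precisely the effect the closed-form analysis quantifies.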
II. System Model

We consider the downlink of a cellular network consisting of one base station (BS) and multiple mobile stations (MSs). We assume that the BS always has traffic to send to each MS.
Wireless channel model