We study the redshift evolution of the luminosity function (LF) and the redshift selection effects of long gamma-ray bursts (LGRBs) by fitting the observed peak-flux and redshift distributions simultaneously. To account for the complex triggering algorithm of Swift, we use a flux-triggering efficiency function. We find evidence supporting an evolving LF, in which the break luminosity scales as $L_b \propto (1+z)^{\tau}$, with $\tau = 3.5^{+0.4}_{-0.2}$ and $\tau = 0.8^{+0.1}_{-0.08}$ for two kinds of LGRB rate model. The corresponding local GRB rates are $\dot{R}(0) = 0.86^{+0.11}_{-0.08}\,\mathrm{yr^{-1}\,Gpc^{-3}}$ and $\dot{R}(0) = 0.54^{+0.25}_{-0.07}\,\mathrm{yr^{-1}\,Gpc^{-3}}$, respectively. Furthermore, by comparing the observed redshift distribution with our mock one, we find that the redshift detection efficiency of flux-triggered GRBs decreases with redshift. In particular, a large number of GRBs lack redshift measurements in the range $1 < z < 2.5$, where the "redshift desert" effect may dominate. More interestingly, our results show that the "redshift desert" effect is mainly introduced by the dimmer GRBs, e.g., $P < 10^{-7}\,\mathrm{erg\,s^{-1}\,cm^{-2}}$, but has no effect on the brighter GRBs.
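For concreteness, the evolving LF can be written schematically as follows (a minimal sketch: the broken power-law form and the symbols $\phi$, $L_{b,0}$, $\alpha_1$, $\alpha_2$ are illustrative assumptions not specified in this abstract; only the scaling $L_b(z) \propto (1+z)^{\tau}$ is taken from the text):

$$
\phi(L, z) \propto
\begin{cases}
\left[ L / L_b(z) \right]^{-\alpha_1}, & L \le L_b(z), \\[4pt]
\left[ L / L_b(z) \right]^{-\alpha_2}, & L > L_b(z),
\end{cases}
\qquad
L_b(z) = L_{b,0}\,(1+z)^{\tau}.
$$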