Abstract: This study used Monte Carlo simulation to investigate the interaction between a linear trend and a lag-one autoregressive (AR(1)) process when both exist in a time series. The simulation experiments demonstrated that the existence of serial correlation alters the variance of the estimate of the Mann-Kendall (MK) statistic, and that the presence of a trend alters the estimate of the magnitude of serial correlation. Furthermore, it was shown that removing a positive serial correlation component from a time series by pre-whitening reduces the magnitude of the existing trend, whereas removing the trend component first, prior to pre-whitening, eliminates the influence of the trend on the serial correlation and does not seriously bias the estimate of the true AR(1) coefficient. These results indicate that the commonly used pre-whitening procedure for eliminating the effect of serial correlation on the MK test can lead to inaccurate assessments of the significance of a trend, and that a different procedure is more appropriate for eliminating the impact of serial correlation on the MK test. In essence, it is advocated that the trend first be removed from a series before the magnitude of serial correlation is estimated. This alternative approach and the previously existing approaches were applied to assess the significance of trends in serially correlated annual mean and annual minimum streamflow data from several pristine river basins in Ontario, Canada. The results indicate that, with the previously existing procedures, researchers and practitioners may have incorrectly identified significant trends.
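The trend-first procedure advocated above (often called trend-free pre-whitening) can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function names are ours, and an ordinary least-squares slope stands in for the Theil-Sen slope that is commonly paired with the MK test.

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of signs of all pairwise differences."""
    x = np.asarray(x, dtype=float)
    s = 0
    for i in range(len(x) - 1):
        s += int(np.sign(x[i + 1:] - x[i]).sum())
    return s

def lag1_autocorr(x):
    """Sample lag-one autocorrelation coefficient."""
    d = np.asarray(x, dtype=float) - np.mean(x)
    return float((d[:-1] * d[1:]).sum() / (d ** 2).sum())

def tfpw(x):
    """Trend-free pre-whitening sketch: remove the trend first, estimate
    AR(1) on the detrended series, pre-whiten, then restore the trend."""
    x = np.asarray(x, dtype=float)
    t = np.arange(len(x))
    b = np.polyfit(t, x, 1)[0]           # slope (OLS stand-in for Theil-Sen)
    detrended = x - b * t                # step 1: remove the trend
    r1 = lag1_autocorr(detrended)        # step 2: AR(1) estimated free of trend bias
    resid = detrended[1:] - r1 * detrended[:-1]  # step 3: pre-whiten
    return resid + b * t[1:]             # step 4: blend the trend back, then apply MK
```

Estimating the AR(1) coefficient on the detrended series (step 2) is the key difference from conventional pre-whitening, which estimates it on the raw series and thereby mixes trend into the serial correlation estimate.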
Pre-whitening has been used to eliminate the influence of serial correlation on the Mann-Kendall (MK) test in trend-detection studies of hydrological time series. However, its ability to accomplish this task has not been well documented. This study investigates the issue by Monte Carlo simulation. The simulated time series consist of a linear trend and a lag-one autoregressive (AR(1)) process with added noise. The simulation results demonstrate that when a trend exists in a time series, the effect of positive or negative serial correlation on the MK test depends on the sample size, the magnitude of the serial correlation, and the magnitude of the trend. When the sample size and the magnitude of the trend are large enough, serial correlation no longer significantly affects the MK test statistic. Removing a positive AR(1) component from a time series by pre-whitening also removes a portion of the trend, and hence reduces the probability of rejecting the null hypothesis when it is false. Conversely, removing a negative AR(1) component by pre-whitening inflates the trend, and leads to an increased probability of rejecting the null hypothesis when it is true. Therefore, pre-whitening is not suitable for eliminating the effect of serial correlation on the MK test when a trend exists within a time series.
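The trend-attenuation effect described above follows directly from the pre-whitening filter: applying x_t − r·x_{t−1} to a pure linear trend b·t yields b·t − r·b·(t−1) = b(1−r)·t + r·b, so the slope shrinks by the factor (1 − r) for positive r and is inflated for negative r. A minimal numerical check, with illustrative values of b and r:

```python
import numpy as np

b, r = 0.5, 0.4                  # trend slope and an assumed AR(1) coefficient
t = np.arange(100, dtype=float)
x = b * t                        # trend-only series (no noise, for clarity)
pw = x[1:] - r * x[:-1]          # pre-whitened series
slope_after = np.polyfit(t[1:], pw, 1)[0]
# slope_after equals b * (1 - r) = 0.3: 40% of the trend has been removed
```

Repeating this with a negative r (e.g. r = −0.4) gives a slope of b(1 − r) = 0.7, i.e. an inflated trend, matching the second finding above.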
It is known that serial correlation within the time series at individual sites, and cross-correlation among sites in a region, influence the ability of statistical tests to assess the field significance of trends over the region. Nevertheless, serial and cross-correlation have often been ignored in field trend analyses. This study develops a methodology that accounts for both serial and cross-correlation in the assessment of the field significance of trends. The regional average Mann-Kendall (RAMK) statistic is used to represent the properties of trends at the regional scale. The null distribution of the RAMK statistic is derived from the fact that a linear combination of m independent normal variables is itself normally distributed. The variance of the RAMK statistic is then modified to account for serial and cross-correlation. The applicability of the method was demonstrated by assessing the field significance of trends in annual mean, annual maximum, and annual minimum daily streamflow from 1967 to 1996 in ten major homogeneous climate regions of Canada. The results indicate that the method provides a more accurate assessment of the field significance of trends than one that neglects serial and cross-correlation. At the 0.10 significance level, annual mean daily flow increased significantly in the Yukon and northern BC mountains region, whereas it decreased significantly in the Pacific and Prairie regions. Annual maximum daily flow decreased significantly across southern Canada, except in the Pacific region. Annual minimum daily flow decreased significantly in the Pacific region and in southeastern Canada, with the exception of the Great Lakes and St. Lawrence River basin region, whereas it increased significantly in the Yukon and northern BC mountains region.
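The regional averaging idea can be sketched as follows. This is an assumed simplification, not the paper's full derivation: it averages the standardized MK statistics of m sites and rescales by the variance of a mean of unit-variance variables with average pairwise correlation rho_bar, Var = (1 + (m − 1)·rho_bar)/m; the paper's specific serial-correlation corrections are not reproduced here.

```python
import math

def ramk_z(site_z, rho_bar):
    """Regional average of per-site MK Z-scores, restandardized to account
    for an assumed average cross-correlation rho_bar among sites."""
    m = len(site_z)
    z_bar = sum(site_z) / m
    # Variance of the mean of m unit-variance variables with average
    # pairwise correlation rho_bar; rho_bar = 0 recovers independence (1/m).
    var = (1.0 + (m - 1) * rho_bar) / m
    return z_bar / math.sqrt(var)

# Example with hypothetical values: 5 sites, modest positive cross-correlation.
# Positive rho_bar inflates the variance of the mean, so the regional Z is
# smaller (less significant) than it would be under independence.
z_corr = ramk_z([1.2, 0.8, 1.5, 0.9, 1.1], rho_bar=0.3)
z_indep = ramk_z([1.2, 0.8, 1.5, 0.9, 1.1], rho_bar=0.0)
```

The comparison of z_corr with z_indep illustrates why ignoring cross-correlation overstates field significance: positive cross-correlation reduces the effective number of independent sites.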
Abstract: Basic concepts such as conditional probability distributions, conditional return periods, and joint return periods are important for understanding and interpreting multivariate hydrological events such as floods and storms. However, these concepts are not well documented in the open literature. This paper assembles and clarifies these concepts and illustrates their practical utility. Relationships between joint return periods and univariate return periods are also derived. These concepts and relationships are demonstrated by applying a bivariate extreme value distribution to represent the joint distribution of flood peak and volume from an actual basin.
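The standard return-period relationships referenced above can be illustrated numerically. For annual events (mean interarrival mu = 1 year) with marginal non-exceedance probabilities F_x and F_y and joint CDF F_xy, the univariate return period is 1/(1 − F), the "OR" joint return period (either variable exceeds its threshold) is 1/(1 − F_xy), and the "AND" joint return period (both exceed) is 1/(1 − F_x − F_y + F_xy). The Gumbel-Hougaard copula below is an assumed stand-in for the paper's bivariate extreme value distribution, and the theta value is illustrative.

```python
import math

def gumbel_hougaard(u, v, theta):
    """Copula C(u, v) joining marginal probabilities u = F_x, v = F_y;
    theta >= 1 controls dependence (theta = 1 gives independence)."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def return_periods(fx, fy, theta, mu=1.0):
    """Univariate, 'OR', and 'AND' return periods from the joint CDF."""
    fxy = gumbel_hougaard(fx, fy, theta)
    t_x = mu / (1.0 - fx)                     # univariate return periods
    t_y = mu / (1.0 - fy)
    t_or = mu / (1.0 - fxy)                   # X > x OR Y > y
    t_and = mu / (1.0 - fx - fy + fxy)        # X > x AND Y > y
    return t_x, t_y, t_or, t_and

# Example: peak and volume both at their 10-year levels (F = 0.9), theta = 2
tx, ty, t_or, t_and = return_periods(0.9, 0.9, theta=2.0)
```

The ordering T_or <= min(T_x, T_y) <= max(T_x, T_y) <= T_and always holds: the "either exceeds" event is more frequent than each marginal exceedance, and the "both exceed" event is rarer.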