All digital data contain error, and many are uncertain. Digital models of elevation surfaces consist of files containing large numbers of measurements representing the height of the surface of the earth, and a proportion of those measurements is therefore very likely to be subject to some level of error and uncertainty. The collection and handling of such data and their associated uncertainties have been the subject of considerable research, which has focused largely upon describing the effects of interpolation and resolution uncertainties, as well as upon modelling the occurrence of errors. However, digital models of elevation derived from new technologies employing active methods of laser and radar ranging are becoming more widespread, and past research will need to be re-evaluated in the near future to accommodate such new data products. In this paper we review the source and nature of errors in digital models of elevation, and in the derivatives of such models. We examine the correction of errors and the assessment of fitness for use, and finally we identify some priorities for future research.
The concept of spatial scale is fundamental to geography, as are the problems of integrating data obtained at different scales. The availability of GIS has provided an appropriate environment in which to re-scale data prior to subsequent integration, but few tools with which to implement the re-scaling. This sparsity of appropriate tools arises primarily because the nature of the spatial variation of interest is often poorly understood and, specifically, the patterns of spatial dependence and error are unknown. Spatial dependence can be represented and modelled using geostatistical approaches, providing a basis for the subsequent re-scaling of spatial data (e.g., via spatial interpolation). Geostatistical techniques can also be used to model the effects of re-scaling data through the geostatistical operation of regularization. Regularization provides a means by which to re-scale the statistics and functions that describe the data, rather than the data themselves. These topics are reviewed in this paper and the importance of the spatial scale problems that remain is emphasized.
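The regularization operation described above can be sketched numerically. In classical geostatistics, the variogram regularized over a support v is the average punctual variogram between two supports separated by a lag, minus the average punctual variogram within a single support (the variance absorbed by averaging over the support). The following is a minimal one-dimensional sketch of that idea, not any specific implementation from the paper; the spherical model, the function names, and the discretization of the support into n points are our own assumptions for illustration.

```python
import numpy as np

def spherical(h, sill=1.0, rng=10.0):
    """Punctual spherical variogram model (assumed here for illustration)."""
    h = np.abs(h)
    return np.where(h < rng, sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3), sill)

def avg_gamma(offsets_a, offsets_b, lag, gamma):
    """Mean punctual variogram between all point pairs in two supports
    whose reference points are separated by `lag` (1-D)."""
    d = np.abs((lag + offsets_b[None, :]) - offsets_a[:, None])
    return gamma(d).mean()

def regularize(gamma, L, lags, n=50):
    """Numerically regularize a punctual variogram over supports of length L:
    gamma_L(h) = mean gamma between the two supports - mean gamma within one support."""
    pts = (np.arange(n) + 0.5) / n * L   # discretize the support by n points
    within = avg_gamma(pts, pts, 0.0, gamma)
    return np.array([avg_gamma(pts, pts, h, gamma) - within for h in lags])
```

The characteristic effect of regularization appears directly: the regularized variogram is zero at lag zero, rises more slowly than the punctual model, and reaches a sill lower than the punctual sill, because averaging over the support removes short-range variance.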
Debates concerning residential population displacement in the context of gentrification remain vociferous, but have long been hampered by a lack of quantitative evidence of the extent of the displacement occurring, and by the difficulties of collecting such evidence. Based on a systematic review of quantitative studies of the displacement associated with gentrification, this article considers how researchers have attempted to measure displacement using a range of statistical and mapping techniques reflecting the multi-dimensional character of gentrification. We note that these techniques often struggle to provide meaningful estimates of the number of individuals and households displaced by gentrification, something compounded by the lack of data available at a sufficiently granular temporal and spatial scale. Noting the limitations of extant methods, we conclude by considering the potential of more novel data sources and emergent methods involving the processing of larger amounts of (micro)data, as well as participatory GIS methods that involve affected communities themselves. This implies that, whilst the quantitative study of displacement remains difficult, patterns and processes of displacement can be inferred through existing data sources, as well as through data generated by those who have themselves experienced displacement.