The advent of the terrestrial laser scanner changed the analysis strategies in engineering geodesy from point-wise approaches to areal ones. During recent years, a multitude of developments regarding a laser scanner-based geometric state description were made. However, areal deformation analysis still represents a challenge. In this paper, a spatio-temporal deformation model is developed, combining the estimation of B-spline surfaces with the stochastic modelling of deformations. The approach's main idea is to model the acquired measuring object by means of three parts, similar to a least squares collocation: a deterministic trend, representing the undistorted object; a stochastic signal, describing a locally homogeneous deformation process; and the measuring noise, accounting for uncertainties caused by the measuring process. Because the deformations are modelled stochastically in the form of distance-dependent variograms, the challenge of defining identical points within two measuring epochs is overcome. Based on the geodetic datum defined by the initial trend surface, both a point-to-surface and a point-to-point comparison of the acquired data sets are possible, resulting in interpretable and meaningful deformation metrics. Furthermore, following the basic ideas of a least squares collocation, the deformation model allows a time-related space-continuous description as well as a space- and time-continuous prediction of the deformation. The developed approach is validated using simulated data sets, and the respective results are analysed and compared with respect to nominal surfaces.
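The trend/signal/noise decomposition described in this abstract can be sketched numerically. The toy example below is an illustrative assumption, not the paper's implementation: it simulates a 1-D profile as a linear trend plus a signal drawn from an exponential, distance-dependent covariance model plus white noise, then recovers the trend by generalized least squares and the signal by collocation-style filtering.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D toy object: deterministic trend + correlated signal + measuring noise
t = np.linspace(0.0, 10.0, 80)            # object coordinates
trend = 2.0 + 0.5 * t                     # undistorted object (linear trend)

# Signal covariance derived from an exponential variogram model
sill, corr_range = 0.3 ** 2, 2.0
D = np.abs(t[:, None] - t[None, :])       # pairwise distances
C_s = sill * np.exp(-D / corr_range)      # distance-dependent covariance
signal = rng.multivariate_normal(np.zeros(t.size), C_s)

sigma_n = 0.05
y = trend + signal + sigma_n * rng.normal(size=t.size)

# Generalized least squares for the trend, then collocation-style filtering
A = np.column_stack([np.ones_like(t), t])
C_n = sigma_n ** 2 * np.eye(t.size)
C_yy = C_s + C_n
x_hat = np.linalg.solve(A.T @ np.linalg.solve(C_yy, A),
                        A.T @ np.linalg.solve(C_yy, y))   # trend parameters
res = y - A @ x_hat
s_hat = C_s @ np.linalg.solve(C_yy, res)                  # filtered signal
```

Because the signal is characterized by its covariance function rather than by identical points, the same machinery predicts the signal at arbitrary (unobserved) coordinates, which is the property the abstract exploits for space- and time-continuous deformation prediction.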
With the establishment of the terrestrial laser scanner, the analysis strategies in engineering geodesy have changed from point-wise approaches to areal ones. These areal analysis strategies are commonly built on the modelling of the acquired point clouds. Freeform curves and surfaces such as B-spline curves/surfaces are one possible approach to obtain space-continuous information. A variety of parameters determines the B-spline's appearance; the B-spline's complexity is mostly determined by the number of control points. Usually, this number of control points is chosen quite arbitrarily by intuitive trial-and-error procedures. In this paper, the Akaike Information Criterion and the Bayesian Information Criterion are investigated with regard to a justified and reproducible choice of the optimal number of control points of B-spline curves. Additionally, we develop a method based on the structural risk minimization of statistical learning theory. Unlike the Akaike and Bayesian Information Criteria, this method does not use the number of parameters as the complexity measure of the approximating functions but their Vapnik-Chervonenkis dimension. Furthermore, it is also valid for non-linear models. Thus, the three methods differ in the target function to be minimized and, consequently, in their definition of optimality. The present paper will be continued by a second paper dealing with the choice of the optimal number of control points of B-spline surfaces.
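The AIC/BIC selection principle the abstract describes can be illustrated with a minimal stand-in example. Two assumptions are made for brevity: ordinary polynomials replace the B-spline basis (so the number of polynomial coefficients plays the role of the number of control points), and residuals are treated as Gaussian, yielding the familiar n·log(RSS/n) likelihood term.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy samples of a smooth curve; candidate models of increasing complexity
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(2.5 * x) + 0.05 * rng.normal(size=x.size)

def aic_bic(n, rss, k):
    """Information criteria for a least-squares fit with Gaussian residuals."""
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    return aic, bic

scores = []
for k in range(1, 12):                      # k parameters (polynomial degree k-1)
    coeffs = np.polyfit(x, y, deg=k - 1)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    scores.append((k, *aic_bic(x.size, rss, k)))

best_aic = min(scores, key=lambda s: s[1])[0]   # complexity chosen by AIC
best_bic = min(scores, key=lambda s: s[2])[0]   # complexity chosen by BIC
```

Since log(n) > 2 for n = 200, BIC penalizes each additional parameter more strongly than AIC and therefore never selects a more complex model, which is the kind of difference in "definition of optimality" the paper compares against the VC-dimension-based alternative.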
The paper presents a review of testing methods and a classification of strategies and tools in terms of technologies and techniques applied to the monitoring of tunnels. The topic is contextualized through a brief introduction in Chapter 1, followed by defect taxonomy and degradation mechanisms in Chapters 2 and 3, respectively. Chapters 4 and 5 are related to monitoring strategies and technologies: the former consists of a purpose-based categorization of monitoring policies, while the latter classifies monitoring methods, including non-destructive and semi-destructive techniques, as well as various types of sensors, also based on the physical or chemical quantity measured. General rules for the implementation and operation of tunnel monitoring systems are presented, taking into account international expert knowledge as well as contemporary practical experience in Austria. The issues considered relate to the fib Model Code 2020 (MC2020), which focuses on the evaluation of structural performance assisted by monitoring and testing. Chapter 6 presents challenges related to monitoring implementation and operation. Chapter 7 discusses monitoring characteristics of new tunnels, including data acquisition and transmission and specific monitoring techniques. Chapter 8 treats considerations related to the monitoring characteristics of existing tunnels under investigation. Concluding remarks and references close the paper.
Freeform surfaces like B-splines have proven to be a suitable tool to model laser scanner point clouds and to form the basis for an areal data analysis, for example an areal deformation analysis. A variety of parameters determines the B-spline's appearance, the B-spline's complexity being mostly determined by the number of control points. Usually, this parameter type is chosen by intuitive trial-and-error procedures. In [...], model selection criteria for choosing the optimal number of control points of B-spline curves were investigated; the present paper continues these investigations. If necessary, the methods proposed in [...] are adapted to B-spline surfaces. The application of those methods to B-spline surfaces reveals the datum problem of those surfaces, meaning that the location and number of control points of two B-spline surfaces are only comparable if they are based on the same parameterization. First investigations to solve this problem are presented.
The topic of indoor positioning and indoor navigation using observations from smartphone sensors is very challenging, as the determined trajectories can be subject to significant deviations compared to the route actually travelled. In particular, the calculation of the direction of movement is the critical part of pedestrian positioning approaches such as Pedestrian Dead Reckoning ("PDR"). Due to distinct systematic effects in filtered trajectories, it can be assumed that systematic deviations are present in the observations from smartphone sensors. This article has two aims. The first is to enable the estimation of partial redundancies for each observation as well as for observation groups; partial redundancies are a measure of reliability, indicating how well systematic deviations can be detected in the single observations used in PDR. The second is to analyze the behavior of the partial redundancies when the stochastic and functional models of the Kalman filter are modified. The equations relating the observations to the orientation are condition equations, which do not exhibit the typical structure of the Gauss-Markov model ("GMM"), wherein the observations are linear and can be formulated as functions of the states. To calculate and analyze the partial redundancy of the observations from smartphone sensors used in PDR, the system equation and the measurement equation of a Kalman filter, as well as the redundancy matrix, need to be derived in the Gauss-Helmert model ("GHM"). These derivations are introduced in this article and lead to a novel Kalman filter structure based on condition equations, enabling a reliability assessment of each observation.
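The article's GHM-based Kalman filter derivation is beyond the scope of an abstract, but the notion of partial redundancy itself can be illustrated in the simpler Gauss-Markov setting, where the redundancy matrix is R = I - A(A'PA)^(-1)A'P and its diagonal contains the partial redundancy of each observation. The design and weight matrices below are made up purely for illustration.

```python
import numpy as np

# Toy Gauss-Markov model: n observations, u unknown parameters
rng = np.random.default_rng(2)
n, u = 8, 2
A = np.column_stack([np.ones(n), np.arange(n, dtype=float)])  # design matrix
P = np.diag(rng.uniform(0.5, 2.0, size=n))                    # weight matrix

# Redundancy matrix R = I - A (A'PA)^(-1) A'P
N = A.T @ P @ A
R = np.eye(n) - A @ np.linalg.solve(N, A.T @ P)

# Partial redundancies: diagonal of R, each in [0, 1]; values near 1 mean a
# systematic deviation in that observation is well detectable, values near 0
# mean it passes into the estimate almost unchecked.
r = np.diag(R)
total_redundancy = float(r.sum())          # equals n - u
```

The sum of the partial redundancies equals the total redundancy n − u, which is the consistency check typically used when the stochastic or functional model is modified, as in the article's analysis.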