For the investigation of turbulence-particle interaction, measurement systems are required that can measure velocity and concentration fluctuations simultaneously. Acoustic Doppler Velocimeters (ADVs) are widely used for velocity and turbulence measurements in natural and artificial flows. Based on acoustic sonar theory, a model is presented that correlates the ADV’s Signal-to-Noise Ratio (SNR) with the suspended solids concentration of several natural (Ems Estuary, Lake Eixendorf, Lake Altmühl) and artificial sediments (Chinafill, quartz powder, bentonite, metakaolin) over the range 0.001 g/L–50 g/L. Within the presented method, the sound absorption in water and by particles is considered in a continuous approach for sampling frequencies up to 100 Hz. The widely used log-linear relation between the SNR and the concentration, which is valid only for low concentrations, was extended to the high-concentration regime. Measurement results show a similar behavior of the SNR with respect to varying suspended solids concentrations for different sediments. However, the analysis of the fit parameters reveals systematic differences depending on the type of sediment. It is concluded that the proposed model is applicable both to laboratory use and to measurements in rivers and estuaries. Finally, we discuss the reliability of the results and the methodology with regard to measurements in rivers, lakes, and estuaries.
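The extended relation described above can be sketched numerically. This is a minimal illustration, assuming a hypothetical functional form in which a log-linear term dominates at low concentrations and a linear attenuation term at high concentrations; the names `snr_model` and `fit_snr` and the parameters a, b, c are illustrative assumptions, not the published calibration:

```python
import numpy as np

# Assumed illustrative form (not the authors' exact fit):
#   SNR(C) = a + b * log10(C) - c * C
# The -c*C term stands in for attenuation that dominates at high C.
def snr_model(conc, a, b, c):
    return a + b * np.log10(conc) - c * conc

def fit_snr(conc, snr):
    """Least-squares fit; the model is linear in (a, b, c)."""
    design = np.column_stack([np.ones_like(conc), np.log10(conc), -conc])
    coeffs, *_ = np.linalg.lstsq(design, snr, rcond=None)
    return coeffs  # a, b, c

if __name__ == "__main__":
    conc = np.logspace(-3, np.log10(50), 40)   # 0.001-50 g/L, as in the study
    snr = snr_model(conc, 30.0, 10.0, 0.4)     # synthetic, noise-free data
    a, b, c = fit_snr(conc, snr)
    print(f"a = {a:.2f}, b = {b:.2f}, c = {c:.2f}")
```

Because the model is linear in its parameters, an ordinary least-squares solve recovers the coefficients without iterative optimisation.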
Climate change is already affecting high mountain regions such as the European Alps. These regions will be confronted with a rise in temperatures significantly above the global average and with more frequent and heavier rain events, including during wintertime. The system response to the coincidence of rain, snow, and possibly frozen soil depends on the almost infinite number of possible combinations of thermo-hydraulic states of the involved phases. Landslides, snow avalanches, debris flows, and extensive surface runoff are just a few of the possible hazardous outcomes. With rising temperatures and increased precipitation, these hazardous outcomes are expected to occur even more frequently in the future, requiring a better understanding of the coupled processes for hazard mitigation strategies. The macroscopic phenomena are controlled by pore-scale processes, such as water freezing and ice grains blocking pores, which are still only poorly understood. The strong coupling between thermal state and hydraulic parameters, the possible phase change, and material heterogeneity pose great challenges for investigation. This work provides an overview of documented hazard events involving rain, snow, and possibly frozen soil. The current state of theoretical and experimental research is presented before several knowledge gaps are derived and possible techniques to address those gaps are discussed.
Racks retain debris in wastewater treatment plants and shield sensitive machinery in numerous engineering applications. In open-channel flows, racks impound the upstream water level by acting as local obstacles to the flow. Based on experimental investigations, empirical approaches usually predict the flow resistance by relying on Bernoulli’s energy principle. Since this principle does not correctly account for downstream conditions such as submerged flows, we present a more accurate workflow to determine the flow resistance based on the Saint-Venant equations. We demonstrate how the loss coefficient and the hydraulic head loss are determined more reliably without adding complexity to the engineering realisation. In contrast to relying solely on two cross-sections with Bernoulli’s energy principle, applying the Saint-Venant equations enables determining the flow depth profile and the flow velocity in the entire channel. This workflow additionally allows predicting the channel’s hydraulic capacity and freeboard in arbitrary applications.
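To illustrate the difference from a two-cross-section energy balance, a steady flow depth profile from the Saint-Venant equations can be integrated over the whole channel. The following is a minimal sketch assuming a rectangular channel, Manning friction, and the rack represented only by a prescribed downstream depth; the function name `backwater_profile` and all parameter values are illustrative assumptions, not the authors' workflow:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def backwater_profile(q, width, n, s0, h_downstream, length, steps=1000):
    """Integrate the gradually varied flow equation
        dh/dx = (S0 - Sf) / (1 - Fr^2)
    upstream from the rack (treated as a downstream depth boundary),
    for a rectangular channel with Manning roughness n.
    Returns the list of depths from the rack to `length` metres upstream."""
    dx = length / steps
    h = h_downstream
    profile = [h]
    for _ in range(steps):
        area = width * h
        r_hyd = area / (width + 2.0 * h)              # hydraulic radius
        v = q / area                                  # mean velocity
        sf = (n * v) ** 2 / r_hyd ** (4.0 / 3.0)      # Manning friction slope
        fr2 = v ** 2 / (G * h)                        # squared Froude number
        dhdx = (s0 - sf) / (1.0 - fr2)
        h -= dhdx * dx                                # step in upstream (-x) direction
        profile.append(h)
    return profile

if __name__ == "__main__":
    # Illustrative subcritical case: the rack impounds the depth to 2.5 m.
    profile = backwater_profile(q=5.0, width=3.0, n=0.03, s0=0.001,
                                h_downstream=2.5, length=2000.0)
    print(f"depth at rack: {profile[0]:.2f} m, "
          f"2 km upstream: {profile[-1]:.2f} m")
```

In this subcritical example the impounded depth relaxes toward the normal depth moving upstream, which is exactly the whole-channel information a two-cross-section Bernoulli balance cannot provide.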
<p>Several powerful physics-based computational landslide run-out models have been developed and validated over the last years. The geohazards community applies these forward models in simulation tools to predict potential landslide run-out outcomes, including their uncertainties, and uses inverse approaches to conduct reanalyses and to infer model parameters for calibration purposes. Yet it remains challenging to turn these computational frameworks into robust, transparent, and transferable simulation-based decision support tools for geohazard mitigation. In particular, the landscape of uncertainties – such as those resulting from the idealised model description itself, from input data (e.g., material parameters or topographic data), and from hyperparameters related to the numerical scheme – is still not systematically managed when conducting landslide simulations. Probabilistic hazard maps that take these uncertainties into account imply a large number of model evaluations, which constitutes a computational bottleneck. This issue can be overcome by using High Performance Computing (HPC) resources along with existing software. Alternatively, physics-informed machine learning strategies use simulation results of the original process model, i.e., the simulator, to train a statistically valid representation, the so-called emulator. Once trained, the emulator significantly reduces computational costs while at the same time granting access to an estimate of the introduced error. A software framework has recently been set up to integrate Gaussian process emulation with the landslide run-out model r.avaflow, an open-source mass flow simulation tool. Emulation-based sensitivity analysis was of comparable quality to conventional studies, and the computational costs were cut significantly. The emulator allowed, for the first time, a global sensitivity analysis to be conducted at every location simultaneously for a complete landslide impact area.
A joint effort across different institutes in Europe has been made in this contribution to test the potential and limitations of the emulation technique by revisiting a number of published case studies. Test cases were selected according to data availability, failure type, and computational demand. Preliminary findings suggest that the emulator can substantially reduce the computational effort of modelling various flow-like landslides. Future work will focus on curating a well-defined database of test scenarios across multiple institutes, with cases ranging from small and medium-sized debris flows to large rock avalanches.</p>
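The emulation idea above can be sketched in a few lines: fit a Gaussian process to a small number of simulator runs, then predict (with an error estimate) anywhere in the input space. This is a minimal sketch with a cheap stand-in function in place of the expensive simulator; r.avaflow is not called, and the RBF kernel and its hyperparameters are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(xa, xb, length_scale=0.3, variance=1.0):
    """Squared-exponential covariance between two 1-D point sets."""
    d2 = (xa[:, None] - xb[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale ** 2)

def gp_predict(x_train, y_train, x_new, noise=1e-8):
    """Posterior mean and variance of a zero-mean Gaussian process emulator
    conditioned on the simulator runs (x_train, y_train)."""
    k_tt = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    k_nt = rbf_kernel(x_new, x_train)
    k_nn = rbf_kernel(x_new, x_new)
    alpha = np.linalg.solve(k_tt, y_train)
    mean = k_nt @ alpha
    cov = k_nn - k_nt @ np.linalg.solve(k_tt, k_nt.T)
    return mean, np.diag(cov)

if __name__ == "__main__":
    # Cheap stand-in for an expensive simulator run (NOT r.avaflow).
    simulator = lambda x: np.sin(2.0 * np.pi * x)
    x_train = np.linspace(0.0, 1.0, 12)        # 12 design points
    y_train = simulator(x_train)               # "expensive" evaluations
    mean, var = gp_predict(x_train, y_train, np.array([0.25]))
    print(f"emulator: {mean[0]:.3f} +/- {math := var[0] ** 0.5:.1e}")
```

Once the training runs are done, each emulator prediction costs only a linear solve against a 12 x 12 matrix, which is the source of the computational savings described above, and the posterior variance is the built-in error estimate.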