This study presents a hybrid data-driven deep learning (DL) method able to predict the velocity field of turbulent flow. Many recent studies have reported applications of recurrent neural networks (RNNs), particularly long short-term memory (LSTM) networks, to sequential data. Airflow around objects and wind speed are the cases most often treated with such hybrid architectures. In some of these studies, the data series is first generated from a known governing equation; in many others, data series extracted from computational fluid dynamics (CFD) simulations are used. This work instead aims at a method that takes raw data, as could be measured with instruments in airflow, a wind tunnel, river flow, wind-speed monitoring, or industrial applications, feeds it to a DL model, and predicts the next time steps. The method operates on spatiotemporal time-series data, which matches the Lagrangian framework in fluid dynamics. The gated recurrent unit (GRU), a successor to the LSTM, is employed to build the model and produce forecasts. The time-series data come from turbulent flow generated in a laboratory and extracted via two-dimensional Lagrangian particle tracking (LPT). These data are used to train the model and to validate its predictions. The results obtained with this approach are significant, and the method could be developed further.
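The recurrence at the heart of the GRU approach described above can be sketched in a few lines. The code below is an illustrative NumPy sketch with random, untrained weights and hypothetical names (`GRUCell`, `forecast_next`), showing the standard GRU gating equations applied to a windowed velocity series; it is not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell (standard gating equations). Weights are random,
    i.e. untrained -- this only illustrates the recurrence, not a fitted model."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        shape_in = (hidden_dim, input_dim)
        shape_h = (hidden_dim, hidden_dim)
        self.Wz, self.Wr, self.Wh = (rng.normal(0, 0.1, shape_in) for _ in range(3))
        self.Uz, self.Ur, self.Uh = (rng.normal(0, 0.1, shape_h) for _ in range(3))

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)              # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)              # reset gate
        h_cand = np.tanh(self.Wh @ x + self.Uh @ (r * h))   # candidate state
        return (1.0 - z) * h + z * h_cand                   # blend old and new state

def forecast_next(series, cell, w_out, window=10):
    """Feed the last `window` velocity samples through the GRU and project
    the final hidden state to one scalar: the next-step velocity."""
    h = np.zeros(cell.Uz.shape[0])
    for u in series[-window:]:
        h = cell.step(np.array([u]), h)
    return float(w_out @ h)

# Toy usage: a sinusoidal "velocity" signal standing in for LPT data.
series = np.sin(np.linspace(0.0, 6.0, 60))
cell = GRUCell(input_dim=1, hidden_dim=8)
w_out = np.full(8, 0.1)   # illustrative readout weights
u_next = forecast_next(series, cell, w_out)
```

In practice the gate and readout weights would be learned by backpropagation through time on the measured series; the sketch only makes the data flow of one forecast step concrete.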
This study aimed to empirically simulate straining turbulent flow, which has direct analogues in many naturally occurring flows and engineering applications. The flow was generated at 100 < Reλ < 500 and seeded with passive and inertial particles. Lagrangian particle tracking and particle image velocimetry were employed to extract particle statistics and flow features, respectively. Previous studies of axisymmetric straining turbulent flow reported that the strain rate, flow geometry, and gravity affect particle statistics. To investigate these effects experimentally, we present the behavior of both passive and inertial particles in a novel experiment in which initially homogeneous turbulence undergoes a sudden axisymmetric expansion. Results are reported for two different mean strain rates and Taylor-microscale Reynolds numbers. In contrast to previous studies, this work considers inertial-particle fields in the presence of gravity. The results show that the newly designed experiments simulated the flow satisfactorily, and that the distortion strongly affects particle dynamics such as the velocity root mean square (RMS) and the Reynolds stress. Straining turbulent flow arises in many industrial applications and physics problems, including stagnation points, external flow around an airfoil, internal flow in pipes of varying cross-section, expansion in engine mixing chambers, and leading-edge erosion, and the conclusions of this study could apply constructively in these areas.
This study presents a deep learning method to build a model and predict the next period of the turbulent-flow velocity. The data are datasets extracted from turbulent flow simulated in the laboratory at Taylor-microscale Reynolds numbers 90 < Rλ < 110. The flow was seeded with tracer particles. Turbulence intensity was created and controlled by eight impellers placed in a turbulence facility, and the flow deformation was produced by two circular flat plates moving toward each other at the center of the tank. The Lagrangian particle-tracking method was applied to measure flow features, and the data were processed to extract flow properties. Since the dataset is sequential, it was used to train long short-term memory (LSTM) and gated recurrent unit (GRU) models. The parallel-computing DEEP-DAM module at the Juelich supercomputer center was applied to accelerate training. The predicted output was assessed and validated against the remaining experimental data for the following period. The results show accurate predictions that could be extended to larger datasets and similar applications. The mean absolute error and R² score range over 0.001–0.002 and 0.9839–0.9873, respectively, for both models with two distinct training-data ratios. Using GPUs substantially increases LSTM training speed compared with CPU-only runs.
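The mean absolute error and R² score used for the validation above have standard definitions; the sketch below (plain NumPy, with illustrative data values, not the paper's dataset) shows how such figures are computed from measured and predicted velocity series.

```python
import numpy as np

def mean_absolute_error(y_true, y_pred):
    """MAE: average magnitude of the prediction error."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(np.abs(y_true - y_pred)))

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - residual variance / total variance."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Toy check: a perfect prediction gives MAE = 0 and R^2 = 1.
measured = [0.10, 0.12, 0.11, 0.15]
assert mean_absolute_error(measured, measured) == 0.0
assert r2_score(measured, measured) == 1.0
```

An MAE near 0.001 with an R² near 0.98, as reported, indicates the predicted series tracks both the magnitude and the variance of the measured velocity closely.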
Turbulent flow is a complex and vital phenomenon in fluid dynamics, as it is the most common type of flow in both natural and artificial systems. Traditional methods of studying turbulent flow, such as computational fluid dynamics and experiments, have limitations including high computational and experimental costs and restricted problem scales and sizes. Recently, artificial intelligence has provided a new avenue for examining turbulent flow, which can help improve our understanding of its features and physics in various applications. Strained turbulent flow in the presence of gravity, which occurs in situations such as combustion chambers and shear flows, is one such case. This study proposes a novel data-driven transformer model to predict the velocity field of turbulent flow, building on the success of this deep sequential-learning technique in areas such as language translation and music generation. The present study applied this model to the experimental work of Hassanian et al., who studied distorted turbulent flow in the Taylor-microscale Reynolds number range 100 < Reλ < 120. The flow underwent a vertical mean strain rate of 8 s−1 in the presence of gravity. The Lagrangian particle tracking technique recorded the velocity and displacement of every tracer particle. Using this dataset, the transformer model was trained with different data ratios and used to predict the velocity for the following period. The model's predictions closely matched the experimental test data, with a mean absolute error of 0.002–0.003 and an R² score of 0.98. Furthermore, the model maintained high predictive performance with less training data, showing its potential to predict future turbulent-flow velocity with fewer computational resources. For assessment, the model was compared with long short-term memory and gated recurrent unit models.
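At the core of a transformer is scaled dot-product self-attention over the windowed sequence, masked so each time step only attends to its own and earlier samples. The sketch below is a minimal causal version in plain NumPy with illustrative names; it shows the mechanism generically and is not the authors' model.

```python
import numpy as np

def causal_attention(Q, K, V):
    """Scaled dot-product attention with a causal mask, so the output at a
    time step depends only on that step and earlier ones."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                        # pairwise step similarity
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)  # block future steps
    scores = np.where(mask, -np.inf, scores)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = w / w.sum(axis=-1, keepdims=True)            # row-wise softmax
    return weights @ V, weights

# Toy usage: 4 time steps of a 3-dimensional embedded velocity window.
rng = np.random.default_rng(1)
X = rng.normal(size=(4, 3))
out, weights = causal_attention(X, X, X)
```

In a full model, `Q`, `K`, and `V` are learned linear projections of the embedded input, and several attention layers are stacked before a readout head predicts the next velocity; the causal mask is what lets the same architecture be trained on whole sequences yet forecast step by step.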
High-performance computing machines, such as JUWELS-DevelBOOSTER at the Juelich Supercomputing Center, were used to train the model and run inference.