Abstract: A digital twin is a widely used method that builds digitized simulations of real-world characteristics because it is effective at predicting results at low cost. In digital twin analysis, the transfer function between the input and output data is an important research subject. In this study, we investigate applying the digital twin method to dust particle sensing. A high-performance multichannel reference dust particle sensor provides particle counts as well as particulate matter informa…
“…Calibration requires a transfer function, which can vary depending on the sampling frequency and the number of sensors, and the transfer function is often unknown. Lee [6] proposed obtaining the transfer function by computing a transfer matrix using SVD.…”
Section: Light Scattering Method
confidence: 99%
“…Lee [6] introduced a methodology employing singular value decomposition (SVD) to replicate the transfer function between the particle count (PC) acquired from a cost-effective single-sensor device (referred to as the "test device") and the PM values obtained from a high-performance multi-sensor device (referred to as the "reference device"). The primary objective was to derive particulate matter (PM) values for the test device (denoted TPM) that align with the PM values of the reference device (denoted RPM).…”
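The SVD-based calibration described in the snippet above can be sketched as an ordinary least-squares fit, since the Moore–Penrose pseudoinverse is computed via SVD. This is a minimal illustration, not the paper's actual implementation: the array names, shapes, and synthetic data below are assumptions.

```python
import numpy as np

# Hypothetical shapes for illustration: PC holds particle counts from the
# test device (n_samples x n_bins); RPM holds the reference device's PM
# values (n_samples x n_pm, e.g. PM2.5 and PM10).
rng = np.random.default_rng(0)
PC = rng.random((200, 6))
true_T = rng.random((6, 2))  # hypothetical ground-truth transfer matrix
RPM = PC @ true_T + 0.01 * rng.standard_normal((200, 2))

# np.linalg.pinv computes the pseudoinverse via SVD, so
# T = pinv(PC) @ RPM is the least-squares transfer matrix
# minimizing ||PC @ T - RPM||_F.
T = np.linalg.pinv(PC) @ RPM

# Test-device PM (TPM) is then reproduced from particle counts alone.
TPM = PC @ T
```

Because the fit is linear, any noise present in the reference PM values propagates into T and hence into TPM, which is the limitation the LSTM post-processing below is meant to address.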
Among the various applications of digital twins, measuring particulate matter in the context of air pollution has become increasingly important owing to growing interest in atmospheric environments. Particle count values are obtained from the raw analogue-to-digital converter data of a photodiode, and particulate matter values are acquired by calibrating them. Singular value decomposition (SVD) is an efficient method for learning the transfer function. However, because the SVD transfer function is linear, it also conveys noise, necessitating post-processing to suppress that noise. This study proposes a method that uses long short-term memory (LSTM) neural networks to effectively stabilize the noise in continuous dust-sensor particulate matter outputs. Taking the relative root-mean-square error of the SVD-based particulate matter (4.4761, 100%) as the reference, the proposed LSTM post-processing demonstrates an improved result (2.9328, 65.52%) compared with other post-processing methods: the mean filter (3.6704, 82.00%), the low-pass filter (3.7719, 84.27%) and the Kalman filter (3.5550, 79.42%). Furthermore, to address the initial delay an LSTM requires before producing stable output, a method of iteratively training on the initial input data sample is proposed. Without initialization, the relative error on the initial input data was 87.57% and about 5 samples were needed before the output stabilized. In contrast, the proposed iterative training method, applied five times to the initial data sample, achieved a relative error of 9.51% and produced stable output immediately.
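One of the baselines compared above, the mean filter, can be sketched as a simple moving average applied to the calibrated PM series. The window size and the synthetic series below are assumptions for illustration, not values taken from the study.

```python
import numpy as np

def mean_filter(x, window=5):
    # Moving-average smoother; window=5 is an assumed parameter.
    kernel = np.ones(window) / window
    # mode="same" keeps the output length equal to the input length;
    # the first and last few samples are biased toward zero by the
    # implicit zero padding at the edges.
    return np.convolve(x, kernel, mode="same")

# Synthetic stand-in for a noisy SVD-calibrated PM series.
t = np.linspace(0.0, 10.0, 200)
clean = 30.0 + 5.0 * np.sin(t)
pm = clean + np.random.default_rng(1).standard_normal(200)
smoothed = mean_filter(pm)
```

Like the low-pass and Kalman filters, this smoother trades responsiveness for noise suppression; the study's point is that an LSTM trained on the continuous PM distribution achieves a lower relative error than these fixed filters.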
Digital twin is one of the key technologies in the digital revolution of measurement technology. The published literature indicates that the advancement and use of digital twin technology will raise the standard of the measurement sector. This review first surveys the current literature on the creation and use of digital twin technology, then lists recognized definitions and summarizes the three main categories of digital twin models for easy reference. The main drawbacks of conventional measurement technology in practice are enumerated: direct measurement is challenging, measuring multiple parameters at once is challenging, the influence of the sensors themselves cannot be disregarded, and the accuracy of measurement results is often unsatisfactory. To address these issues, the review outlines the benefits and potential uses of digital twin technology in measurement, summarized as six significant contributions: strong applicability and robustness, the ability to visualize how a measured parameter changes, simultaneous measurement of many parameters, low measurement cost, data security, integrity, and high availability, and intelligent measurement. Future directions for digital twin research in measurement technology are explored. The digital twin and virtual sensor simulation methods offer a new digital solution and path for the development of measurement technology.