Abstract: This work introduces a process to develop a tool-independent, high-fidelity, ray-tracing-based light detection and ranging (LiDAR) model. This virtual LiDAR sensor includes accurate modeling of the scan pattern and a complete signal-processing toolchain of a LiDAR sensor. It is developed as a functional mock-up unit (FMU) by using the standardized open simulation interface (OSI) 3.0.2 and functional mock-up interface (FMI) 2.0. Subsequently, it was integrated into the virtual environments of two commercial software tools […]
“…This section will provide an overview of the toolchain and signal processing steps of the LiDAR FMU model. A detailed description of the LiDAR FMU modeling methodology can be found in [23]. The toolchain and signal processing steps of the LiDAR FMU model are shown in Figure 4.…”
Section: Devices Under Test
“…Therefore, in this paper, we have evaluated the 3D imaging and point-to-point distance measurement performance of the Blickfeld micro-electro-mechanical systems (MEMS)-based automotive LiDAR sensor and its simulation model (LiDAR FMU) according to the ASTM E3125-17 test methods. It should be noted that the same authors developed the LiDAR FMU model in their previous work [23] for the simulation-based testing of ADAS, and the presented paper is a continuation of it. The authors want to draw the attention of the scientific community working on the development and validation of real and virtual automotive LiDAR sensors to the ASTM E3125-17 standard.…”
Section: Introduction
“…The LiDAR sensor model is developed as a functional mock-up unit (FMU) by using the standardized open simulation interface (OSI) and functional mock-up interface (FMI) [23]. The model is packaged as an OSI sensor model packaging (OSMP) FMU [25].…”
Section: Introduction
“…Therefore, it was integrated successfully into the co-simulation environment of CarMaker from IPG Automotive. The virtual LiDAR sensor includes accurate modeling of the scan pattern and the complete signal processing toolchain of the Cube 1 LiDAR sensor, as described in [23].…”
Section: Introduction
“…It measures the round-trip delay time (RTDT) that laser light takes to hit an object and return to the detector. With the RTDT, the range R can be calculated as [23]: R = (c · τ)/2, where the range is denoted by R, c is the speed of light, and τ is the RTDT, also known as the time of flight (ToF).…”
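The range equation above can be sketched numerically; the function name and the sample RTDT value are illustrative, not taken from the source:

```python
# Range from round-trip delay time: R = c * tau / 2
# (the factor 1/2 converts the round trip into a one-way range).
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_rtdt(tau_s: float) -> float:
    """Convert a round-trip delay time (seconds) to a one-way range (meters)."""
    return C * tau_s / 2.0

# Example: a 200 ns round trip corresponds to roughly 30 m of range.
r = range_from_rtdt(200e-9)
```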
Measurement performance evaluation of real and virtual automotive light detection and ranging (LiDAR) sensors is an active area of research. However, no commonly accepted automotive standards, metrics, or criteria exist to evaluate their measurement performance. ASTM International released the ASTM E3125-17 standard for the operational performance evaluation of 3D imaging systems commonly referred to as terrestrial laser scanners (TLS). This standard defines the specifications and static test procedures to evaluate the 3D imaging and point-to-point distance measurement performance of TLS. In this work, we have assessed the 3D imaging and point-to-point distance estimation performance of a commercial micro-electro-mechanical system (MEMS)-based automotive LiDAR sensor and its simulation model according to the test procedures defined in this standard. The static tests were performed in a laboratory environment. In addition, a subset of static tests was also performed at the proving ground in natural environmental conditions to determine the 3D imaging and point-to-point distance measurement performance of the real LiDAR sensor. Furthermore, real scenarios and environmental conditions were replicated in the virtual environment of commercial software to verify the LiDAR model’s working performance. The evaluation results show that the LiDAR sensor and its simulation model under analysis pass all the tests specified in the ASTM E3125-17 standard. This standard helps to understand whether sensor measurement errors are due to internal or external influences. We have also shown that the 3D imaging and point-to-point distance estimation performance of LiDAR sensors significantly impacts the working performance of object recognition algorithms. That is why this standard can be beneficial in validating real and virtual automotive LiDAR sensors, at least in the early stage of development.
Furthermore, the simulation and real measurements show good agreement on the point cloud and object recognition levels.
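The point-to-point distance evaluation described in the abstract can be illustrated with a minimal sketch; the helper names and coordinates below are assumptions for illustration, not values or procedures taken from ASTM E3125-17:

```python
import math

def p2p_distance(a, b):
    """Euclidean distance between two 3D points (e.g., estimated target centers)."""
    return math.dist(a, b)

def distance_error(measured_a, measured_b, reference_dist):
    """Signed error of the LiDAR-derived point-to-point distance
    against a reference-instrument value."""
    return p2p_distance(measured_a, measured_b) - reference_dist

# Example with made-up coordinates: two target centers estimated from point clouds,
# compared against a 5.000 m reference distance.
err = distance_error((0.01, 0.0, 0.0), (5.0, 0.0, 0.02), 5.0)
```

In the standard's test procedures, such signed errors are compared against the manufacturer's specification to decide pass or fail.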
Autonomous driving simulators are an effective tool for developing autonomous driving algorithms and are therefore widely used in research and development. However, the validity of the simulation results depends closely on how well the virtual model matches reality. Therefore, analyzing the characteristics of real sensors is necessary for the mathematical modeling of virtual sensors in autonomous driving simulators. This paper presents a high-fidelity virtual lidar that operates similarly to the real sensor. The intensity variable, which represents the strength of the received signal relative to the transmitted signal, is used effectively to improve the fidelity of the virtual lidar at low computing cost. The proposed virtual lidar is implemented in an autonomous driving simulator, and its feasibility is shown by comparison with an existing virtual lidar.
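A received-to-transmitted intensity ratio of the kind mentioned above can be sketched as follows; the inverse-square falloff and the reflectivity/incidence terms are generic assumptions for illustration, not the model used in the cited paper:

```python
import math

def intensity_ratio(range_m: float, reflectivity: float, incidence_rad: float) -> float:
    """Received power relative to transmitted power for one lidar return.

    Generic illustrative model: signal weakens with the square of the range,
    scales with target reflectivity, and drops with oblique incidence.
    """
    if range_m <= 0.0:
        return 0.0
    return reflectivity * math.cos(incidence_rad) / (range_m ** 2)

# A nearby, head-on target returns a stronger signal than a distant, oblique one.
near = intensity_ratio(5.0, 0.8, 0.0)
far = intensity_ratio(50.0, 0.8, math.radians(60))
```

A virtual lidar can attach such a per-return intensity to each simulated point at low computational cost, which is the kind of fidelity improvement the abstract describes.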