2021 Telecoms Conference (ConfTELE)
DOI: 10.1109/conftele50222.2021.9435580
An Overview of LiDAR Requirements and Techniques for Autonomous Driving

Cited by 27 publications (6 citation statements)
References 43 publications
“…The mismatch between the two training objectives means that the targets before and after cannot be optimally matched, so the detection accuracy cannot be greatly improved. For example, the loss function in a depth estimation algorithm computes the difference between the predicted depth and the ground truth at every pixel that has a ground-truth value [7]. This loss weights all valid pixels equally rather than focusing on the target.…”
Section: Spatial Attention (mentioning)
confidence: 99%
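The uniform per-pixel weighting described in that statement can be made concrete with a minimal sketch; the L1 penalty, the validity-mask convention (ground truth > 0), and the function name are illustrative assumptions, not the loss of the cited work [7]:

```python
import torch

def masked_depth_l1_loss(pred_depth, gt_depth):
    """Per-pixel L1 depth loss, averaged only over pixels that have a
    ground-truth value (gt_depth > 0), as is common in depth-estimation
    training. Every valid pixel contributes equally, regardless of
    whether it lies on an object of interest."""
    valid = gt_depth > 0                      # pixels with ground truth
    diff = torch.abs(pred_depth - gt_depth)   # per-pixel absolute error
    return diff[valid].mean()                 # uniform average over valid pixels

# Example usage with random tensors standing in for network output and labels
pred = torch.rand(1, 1, 4, 4) * 80.0          # predicted depth map (metres)
gt = torch.rand(1, 1, 4, 4) * 80.0
gt[..., :2] = 0.0                             # simulate pixels without ground truth
print(masked_depth_l1_loss(pred, gt))
```

Because every valid pixel contributes equally, errors on small foreground targets are easily dominated by the background, which is the mismatch the statement points out.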
“…Lasers generating nanosecond optical pulses with high pulse powers are key components in time-of-flight (ToF) LIDAR systems that provide fast and reliable 3D maps of the environment for, e.g., autonomous movement of vehicles or robots, or industrial sensing [1][2][3]. The requirements the system must fulfill for deployment range from, for example, narrow-band emission in an atmospheric transparency window (e.g.…”
Section: Introduction (mentioning)
confidence: 99%
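As a back-of-the-envelope sketch of why nanosecond pulses matter here (the constants and pulse widths are illustrative, not taken from the cited work), the pulse width of such a laser roughly bounds the range resolution of a pulsed ToF system:

```python
# Minimal sketch: range resolution of a pulsed ToF LIDAR is roughly
# delta_r ~= c * tau / 2, so nanosecond pulses correspond to decimetre-scale
# resolution. Values below are illustrative assumptions.
C = 299_792_458.0  # speed of light in m/s

def range_resolution(pulse_width_s: float) -> float:
    """Approximate range resolution of a pulsed ToF system."""
    return C * pulse_width_s / 2.0

for tau_ns in (1, 5, 10):
    print(f"{tau_ns} ns pulse -> ~{range_resolution(tau_ns * 1e-9):.2f} m resolution")
```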
“…TOF-based sensors use the time and phase differences between the emitted and the reflected light to estimate the distance between the object and the sensor. They are the more suitable candidates for large FOVs and depth ranges, which is why LiDARs, one type of TOF-based depth imaging modality, predominate in the autonomous driving field [8]. However, TOF sensors such as the Intel RealSense L515 have low depth reconstruction accuracy, as shown in Figure 1, making them unsuitable for industrial lines where products are small and require feature recognition at millimeter resolution.…”
Section: Introduction (mentioning)
confidence: 99%
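A minimal sketch of the two estimation principles mentioned in that statement, direct (pulsed) and indirect (phase-shift) ToF; the round-trip time and modulation frequency below are illustrative assumptions:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_time(delta_t_s: float) -> float:
    """Direct (pulsed) ToF: light travels to the object and back,
    so the one-way distance is c * delta_t / 2."""
    return C * delta_t_s / 2.0

def distance_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect (phase-shift) ToF with a continuous wave modulated at
    mod_freq_hz: d = c * phi / (4 * pi * f), unambiguous only up to
    c / (2 * f)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

print(distance_from_time(200e-9))           # ~30 m for a 200 ns round trip
print(distance_from_phase(math.pi, 10e6))   # ~7.5 m at 10 MHz modulation
```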