“…Table 1 provides a comparative overview of the influence of weather conditions by sensor type. Recently, multi-sensor-based traffic monitoring systems have begun to gain more traction [28], [29], [33]-[35]. The primary motivation behind this trend is to improve detection accuracy and mitigate the shortcomings in complex, changing environments and weather conditions [30]-[32].…”
Traffic monitoring systems featuring robust, multi-sensor fusion capabilities are rapidly growing in demand to observe traffic flow, reduce congestion, and detect and report traffic accidents. However, monitoring outdoor environments using cameras remains challenging due to complex weather conditions, including fog, rain, snow, and variable lighting. These conditions can significantly degrade the performance of machine-learning-based vehicle detection and classification. Unfortunately, openly available datasets for multi-sensor traffic monitoring development and testing remain limited, especially those featuring infrastructure-based cameras and millimeter-wave (mmWave) radar. To address these challenges, we evaluate open camera and mmWave radar data using vehicle classification models for cars, trucks, vans, and buses on embedded hardware. We also provide an open multi-sensor traffic monitoring dataset with more than 8,000 manually annotated frames as well as mmWave radar point clouds recorded in an urban environment under sunny, partially cloudy, cloudy, rainy, and night conditions.
“…In recent years, the rapid advancement of millimeter-wave technology has brought cost-effective, compact, and powerful millimeter-wave sensors into the mainstream, with millimeter-wave radar being the most widely used among them. Millimeter-wave radar is not only capable of measuring target distance, velocity, angle, positioning [1], and tracking [2] information but also finds extensive applications in human detection [3], behavior recognition [4], [5], micro-motion detection [6], and vital sign monitoring [7], [8]. Among these, one of the primary reasons millimeter-wave radar can achieve…”
Millimetre-wave frequency-modulated continuous-wave (FMCW) radar is widely used in various scenarios. However, it is often affected by static and dynamic clutter interference, which degrades its performance. Specifically, these clutter signals are often mistaken for target signals, leading to false detections and reducing the accuracy of target tracking and localization. In addition, dynamic clutter sources, such as other moving objects, introduce Doppler frequency shift interference, further affecting the measurement of target velocity. In this paper, we address static clutter with a frame mean subtraction method. For the more complex problem of dynamic clutter, we introduce a filtering approach guided by range-Doppler information. This method uses a mask, generated in real time by tracking the temporal range information of the target, as prior information for filtering the radar signals. Subsequently, we employ a novel fractional short-time Fourier transform to extract the Doppler feature spectrogram of the radar signal. Finally, a ResNet-50 model trained on the Doppler spectrograms of interference-free radar signals is used to test the Doppler maps generated from the filtered radar signals. In testing, the classification accuracy reaches 97.5%. This result shows that the micro-Doppler spectrum obtained by filtering radar signals collected in complex scenes with the proposed method is highly similar to the micro-Doppler spectrum of the target under test. Moreover, the proposed filtering method not only suppresses interference but also enhances the strength of the target signal and provides more detailed information for the subsequent recognition task.

INDEX TERMS LFMCW, micro-Doppler, radar signal filtering, convolutional neural network.
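The static-clutter step mentioned in the abstract, frame mean subtraction, is a standard technique: static reflectors contribute an (ideally) identical return in every frame, so averaging across frames (slow time) estimates the clutter, which is then subtracted. The sketch below is illustrative only; the function name and toy data are assumptions, not taken from the paper.

```python
import numpy as np

def frame_mean_subtraction(frames: np.ndarray) -> np.ndarray:
    """Suppress static clutter in FMCW radar data.

    frames: array of shape (num_frames, num_samples), e.g. raw ADC
            samples or range profiles, one row per frame.
    Static reflectors contribute the same value in every frame, so the
    per-sample mean over frames estimates the static clutter; subtracting
    it leaves only the dynamic (moving-target) component.
    """
    clutter = frames.mean(axis=0, keepdims=True)  # slow-time average
    return frames - clutter

# Toy example: a constant "static reflector" plus a zero-mean dynamic part.
rng = np.random.default_rng(0)
static = 5.0 * np.ones((8, 16))            # identical in every frame
moving = rng.normal(0.0, 1.0, size=(8, 16))
filtered = frame_mean_subtraction(static + moving)
# The constant clutter is removed; only the re-centred dynamic part remains.
```

Note that this simple estimator also removes any DC component of the moving target; the paper's range-Doppler-guided mask is what handles the harder dynamic-clutter case, which this sketch does not attempt.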