2013
DOI: 10.1007/s12205-013-0387-9

Detection of lateral hazardous driving events using in-vehicle gyro sensor data

Cited by 15 publications (9 citation statements)
References 5 publications

Citation statements (ordered by relevance):
“…Impressive training accuracy of 99.9% and validation accuracy of 100% were attained by the RNN-LSTM model. Jeong et al. (2013) conducted a study using data from the in-car gyro sensor to identify lateral risky driving incidents. They equipped a probe vehicle with a customized data collection setup.…”
Section: Speed and Velocity Sensors
Confidence: 99%
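The excerpt above pairs an RNN-LSTM classifier with windowed in-vehicle gyro data. As an illustration only, the sketch below shows one plausible shape for such a model: a single-layer LSTM over fixed-length windows of 3-axis gyro samples with a two-class output (normal driving vs. lateral event). The window length, sampling rate, hidden size, and class set are assumptions, not details taken from the cited papers.

```python
# Illustrative sketch only (assumed architecture, not the cited authors' code):
# an LSTM that labels fixed-length windows of 3-axis gyro samples as
# "normal driving" or "lateral event".
import torch
import torch.nn as nn


class GyroLSTMClassifier(nn.Module):
    def __init__(self, n_features=3, hidden_size=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x):              # x: (batch, time, n_features)
        _, (h_n, _) = self.lstm(x)     # final hidden state: (1, batch, hidden)
        return self.head(h_n[-1])      # class logits: (batch, n_classes)


# Stand-in for a batch of windows: 2 s of 3-axis gyro data at 50 Hz (assumed rate).
windows = torch.randn(8, 100, 3)
model = GyroLSTMClassifier()
logits = model(windows)
print(logits.shape)                    # torch.Size([8, 2])
```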
“…Omerustaoglu et al. combined in-car data and image data to study distracted driving behaviors through deep learning [11]. Jeong et al. used the data collected by the vehicle's built-in 3-axis gyroscope to identify two driving behaviors with a support vector machine [12]. Other studies described various aggressive driving behaviors and formulated standards for them (Tasca [13], Abou-Zeid [14], Li [15], Yang [16]).…”
Section: Related Work
Confidence: 99%
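This excerpt describes classifying two driving behaviors from the vehicle's built-in 3-axis gyroscope with a support vector machine. The sketch below is a minimal, assumed illustration of that kind of pipeline using scikit-learn: per-axis summary statistics are computed over fixed windows of simulated gyro readings and fed to an RBF-kernel SVM. The feature choice and the synthetic data are assumptions made for the sake of a runnable example, not the paper's actual setup.

```python
# Illustrative sketch only (assumed features and simulated data, not the
# paper's implementation): an RBF-kernel SVM separating two driving behaviors
# from per-axis statistics of windowed 3-axis gyroscope readings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)


def window_features(window):
    """window: (n_samples, 3) gyro rates -> per-axis mean and std (6 features)."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])


# Simulated windows: the "lateral event" class has noticeably larger variance.
X, y = [], []
for label, scale in ((0, 0.05), (1, 0.4)):
    for _ in range(200):
        X.append(window_features(rng.normal(0.0, scale, size=(100, 3))))
        y.append(label)
X, y = np.asarray(X), np.asarray(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```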
“…Xu and Wang (2020) provided a safety prewarning mechanism, which can automatically extract safety information from surveillance cameras based on computer vision, assess risks based on the embedded comprehensive risk assessment model, categorize risks into five levels and provide timely suggestions. Jeong et al. (2021) developed a computer vision-based solitary work detection model that considers interactive operations between heavy equipment and spotters. Li et al. (2022) took a reinforcement processing area as a research case and proposed a new method for recognizing each worker's activity through the position relationship of objects detected by Faster R-CNN.…”
Section: Literature Review 2.1 Application of Computer Vision in Const...
Confidence: 99%
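The last excerpt turns to computer-vision safety monitoring built on object detectors such as Faster R-CNN. Purely as an assumed illustration of the detection step, the sketch below runs torchvision's Faster R-CNN on a single frame and keeps confident detections; the downstream reasoning about spatial relations between workers and equipment described by Li et al. (2022) is only hinted at in comments. Weights are left uninitialized here so the snippet runs offline; a real system would load pretrained or fine-tuned weights.

```python
# Illustrative sketch only (not the cited pipeline): run a Faster R-CNN
# detector on one frame and keep confident detections. A real system would
# load pretrained/fine-tuned weights and then reason about the spatial
# relations between boxes of different classes (e.g., worker vs. equipment).
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn().eval()   # weights left uninitialized on purpose

frame = torch.rand(3, 480, 640)            # stand-in for one surveillance frame
with torch.no_grad():
    det = model([frame])[0]                # dict with 'boxes', 'labels', 'scores'

keep = det["scores"] > 0.5                 # confidence threshold (assumed value)
print(det["boxes"][keep].shape, det["labels"][keep])
```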