2023
DOI: 10.1111/mice.13050

A vision monitoring system for multipoint deflection of large‐span bridge based on camera networking

Abstract: This paper proposes a vision monitoring system for multipoint deflection of a large‐span bridge that effectively compensates for camera motion–induced errors. The camera network system (CNS) consists of a series of dual‐camera stations linked by cooperative markers. An infrared illumination supplement device is integrated into each camera station, and a regular double‐sided prism is selected as the cooperative marker to enable 24‐h continuous monitoring. The feasibility and efficiency of the CNS are ver…
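The abstract highlights compensation of camera motion–induced errors as the core contribution. As a purely illustrative sketch (not the authors' dual‐camera algorithm), the snippet below shows the generic reference‐marker idea often used for this: subtract the apparent motion of a marker assumed stationary from the tracked motion of the structural target before converting pixels to millimetres. The function name, array layout, and scale factor are all hypothetical.

```python
# Minimal sketch (assumed, not the paper's implementation): vision-based
# deflection measurement with camera-motion compensation via a stationary
# reference marker. Assumes per-frame pixel coordinates of a target marker on
# the bridge and of a reference marker on a fixed background are already
# tracked, and that a scale factor (mm per pixel) is known at the target.

import numpy as np

def deflection_series(target_px, reference_px, scale_mm_per_px):
    """Return vertical deflection in mm, compensated for camera motion.

    target_px, reference_px: (N, 2) arrays of (u, v) pixel coordinates per frame.
    scale_mm_per_px: assumed conversion from image pixels to millimetres.
    """
    target_px = np.asarray(target_px, dtype=float)
    reference_px = np.asarray(reference_px, dtype=float)

    # Apparent motion of each marker relative to the first frame (pixels).
    target_motion = target_px - target_px[0]
    reference_motion = reference_px - reference_px[0]

    # The reference marker is assumed stationary, so its apparent motion is
    # attributed to camera motion and subtracted from the target motion.
    corrected = target_motion - reference_motion

    # Under this assumption the vertical image axis (v) maps to deflection.
    return corrected[:, 1] * scale_mm_per_px


if __name__ == "__main__":
    # Synthetic example: 1 px of camera drift contaminates both markers.
    t = np.array([[320.0, 240.0], [320.0, 243.0], [320.0, 245.5]])
    r = np.array([[100.0, 50.0], [100.0, 51.0], [100.0, 51.0]])
    print(deflection_series(t, r, scale_mm_per_px=2.0))  # -> [0. 4. 9.]
```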

Cited by 15 publications (5 citation statements); references 67 publications.
“…Among them, one-stage detectors have faster recognition speeds and are more suitable for tasks with high real-time requirements. Target detection technology has found extensive applications across various branches of civil engineering, including bolt-loosening detection [11], structural deformation monitoring [12], and more. In the context of identifying the spatiotemporal distribution of vehicles, Xia et al. [13] achieved vehicle trajectory recognition under complex driving conditions.…”
Section: Introduction (mentioning)
confidence: 99%
“…The experimental measurement of displacements through video processing for the purpose of structural monitoring in civil engineering infrastructures has attracted much attention in the past decade. Hundreds of research articles, mostly focusing on the dynamic response monitoring of bridges, were reviewed in recent comprehensive analyses of the state-of-the-art [1-5], and more articles continue to appear, e.g., [6-23]. The reasons for such interest can be explained by considering the appealing characteristics of such technology, namely:…”
Section: Introduction (mentioning)
confidence: 99%
“…With the use of construction robots or drones, defects such as incorrect dimensions or cracks in construction components can be measured, thus enabling more efficient construction tasks (Shi et al., 2023; Yamaguchi & Mizutani, 2022). Additionally, even after the completion of construction, these technologies can be utilized to monitor the displacement of various structural elements, evaluate the safety and stability of the structure, and facilitate ongoing maintenance of the building (Yin et al., 2023). H. S. Park et al. (2007) introduced terrestrial laser scanning (TLS) for structural deformation measurements, and this technology continues to be actively researched by scholars in the construction field (Ge et al., 2023; Mirzaei et al., 2023; S.…”
Section: Introduction (mentioning)
confidence: 99%