2021
DOI: 10.1016/j.optlaseng.2021.106695

Deep learning integral imaging for three-dimensional visualization, object detection, and segmentation

Cited by 10 publications (9 citation statements)
References 33 publications
“…In a recent study [29], a deep learning integral imaging system was proposed that can reconstruct a 3D object without dealing with the out-of-focus (blurred) areas that occur in the Integral-Imaging computationally reconstructed depth planes. Targets in the scene are first detected and segmented in the 2D elemental images using a pre-trained Mask R-CNN.…”
Section: Previous Related Work
Confidence: 99%
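The statement above summarizes the pipeline of [29]: targets are first detected and segmented in the 2D elemental images with a pre-trained Mask R-CNN, and only those segmented regions feed the subsequent 3D integral-imaging reconstruction. Below is a minimal sketch of that segmentation step, assuming torchvision's COCO-pretrained Mask R-CNN as a stand-in; the function name, file path, and thresholds are illustrative and not the authors' implementation.

```python
# Minimal sketch: segment targets in one 2D elemental image with a
# pre-trained Mask R-CNN (torchvision, COCO weights). The loading of
# elemental images and the integral-imaging reconstruction itself are
# outside this snippet.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# COCO-pretrained Mask R-CNN (torchvision >= 0.13 uses the `weights` argument).
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def segment_elemental_image(path, score_threshold=0.5):
    """Return binary target masks for a single elemental image."""
    image = to_tensor(Image.open(path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]
    keep = output["scores"] > score_threshold
    # Predicted masks are soft (0..1); threshold to binary segmentation maps.
    masks = output["masks"][keep, 0] > 0.5
    return masks

# Hypothetical usage: segment every elemental image before reconstruction so
# that only target pixels contribute to the reconstructed depth planes.
# masks = segment_elemental_image("elemental_00.png")
```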
“…A software platform based on machine-learning architecture and driven by AI principles that mimic human intelligence to learn not only from data fed into the system, but also real-time data is the ultimate goal of the AAF. These systems could improve target recognition and tracking capabilities to analyse data from sensors and other sources (Yi et al, 2021), and make decisions about when and how to engage targets. The system should be able to accomplish the following basic functions:…”
Section: Design
Confidence: 99%
“…A software platform based on machine-learning architecture and driven by AI principles that mimic human intelligence to learn not only from data fed into the system, but also real-time data is the ultimate goal of the AAF. These systems could improve target recognition and tracking capabilities to analyse data from sensors and other sources (Yi et al, 2021), and make decisions about when and how to engage targets. The system should be able to accomplish the following basic functions: authenticate whether the person holding the gun is authorized to use it, and if not, lock the weapon while sending out an unauthorized use warning to the relevant authorities; track the position of the gun using GPS, gyroscopic data and Bluetooth; record the basic characteristics, body position, actions and non-verbal cues of the target; measure the height of the target, the distance from the shooter and the trajectory of projectiles (whether smart ammunition or traditional); record, detect, identify and label every object in its field of view; stitch together 360-degree images and video inside the gun to prevent tampering; use advanced threat-detection algorithms, such as real-time emergency detection models built on machine-learning infrastructures; efficiently process data and determine outcomes with the ability to display information swiftly in a readable and understandable format; smartly integrate battery-saving features, such as sleep mode (when idle; holstered) and quick wake (drawn); monitor ammunition and alert when the firearm needs reloading. A conceptual drawing of an AAF is presented in Figure 1.…”
Section: Conceptualizing the Future Police Firearm
Confidence: 99%
“…Target detection based on deep learning has been a research hot spot over the past few years and has attracted wide attention [1], and the relevant technologies have been applied to face recognition [2], driverless cars [3,4], pedestrian tracking [5], intelligent transportation [6], and other fields. Target detection [7] has been identified as a key technology of the Internet of Things in the livestock industry, providing a vital means to monitor in real time the location, number, health status, and estrus state of livestock; this technology also lays a basis for achieving intelligent pasture.…”
Section: Introduction
Confidence: 99%