Raw sonar images cannot be used directly for underwater detection or recognition, because disturbances such as grating lobes and multi-path propagation distort the gray-level distribution of sonar images and produce phantom echoes. To obtain a more robust segmentation method at a reasonable computational cost, a prior-knowledge-based threshold segmentation method for underwater linear object detection is discussed. The possibility of guiding the segmentation-threshold evolution of forward-looking sonar images with prior knowledge is verified by experiment. During the threshold evolution, the collinear relation of the two lines corresponding to the double peaks in the voting space of the edge image is used as the termination criterion. The two steps interact: the Hough transform provides the basis for the collinear relation of the lines, while the binary image generated at the current threshold provides the input to the Hough transform. The experimental results show that the proposed method maintains a good tradeoff between segmentation quality and computational time in comparison with conventional segmentation methods. The proposed method thus provides a basis for further processing toward unsupervised underwater visual understanding.
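As a rough sketch of the termination test described above, the following code votes edge pixels into a plain ρ–θ Hough accumulator, extracts the two dominant peaks, and checks whether they share one orientation within a small gap. All function names and tolerances here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def hough_lines(edges, n_theta=180):
    """Vote edge pixels into (rho, theta) space; return accumulator, angles, offset."""
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))          # rho range is [-diag, diag]
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag + 1, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(edges)
    for t_idx, theta in enumerate(thetas):
        rhos = np.round(xs * np.cos(theta) + ys * np.sin(theta)).astype(int) + diag
        np.add.at(acc[:, t_idx], rhos, 1)        # accumulate votes, duplicates included
    return acc, thetas, diag

def top_two_peaks(acc):
    """Strongest peak, then the strongest outside a small suppressed neighbourhood."""
    p1 = np.unravel_index(np.argmax(acc), acc.shape)
    masked = acc.copy()
    r0, t0 = p1                                   # no theta wrap-around handling (sketch)
    masked[max(r0 - 5, 0):r0 + 6, max(t0 - 5, 0):t0 + 6] = 0
    p2 = np.unravel_index(np.argmax(masked), acc.shape)
    return p1, p2

def peaks_collinear(p1, p2, thetas, theta_tol=np.deg2rad(3), max_gap=10):
    """Simplified stand-in for the termination criterion: the two dominant
    peaks share one orientation and lie within a plausible object width."""
    return (abs(thetas[p1[1]] - thetas[p2[1]]) < theta_tol
            and abs(p1[0] - p2[0]) <= max_gap)
```

In an evolution loop of the kind the abstract describes, the threshold would be stepped, the resulting binary image edge-detected and voted, and the loop terminated once `peaks_collinear` holds for the two dominant peaks.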
To generate a seamless mosaic from a Forward-Looking Sonar (FLS) video sequence, this study proposes a novel fusion method for FLS image mosaicking that proceeds in two main steps, from coarse to fine. In the coarse fusion step, the source images are first decomposed into multi-scale sub-bands using the Non-Subsampled Contourlet Transform (NSCT). The sub-bands of the source images are then merged, based on the Gabor energy for the low-frequency sub-bands and the local contrast for the high-frequency sub-bands, and the fused sub-bands are reconstructed into the coarsely fused image by the inverse NSCT. In the fine fusion step, a decision map selects each pixel of the finely fused image from either the coarsely fused image or one of the source images. This decision map is first constructed by measuring the similarity of the coarsely fused image to the source images, and then processed by a morphological post-processing technique to ensure its continuity and smoothness. Extensive experiments on FLS image fusion and mosaicking demonstrate the effectiveness and superiority of the proposed technique under both subjective evaluation and objective metrics.
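The fine fusion step can be illustrated with a deliberately simplified sketch: per pixel, decide which source image the coarse result resembles more over a local window, then smooth the binary decision map. This two-way decision and the majority filter are stand-ins for the paper's similarity measure and morphological post-processing; all names and window sizes are assumptions:

```python
import numpy as np

def _box_sum(img, win):
    """Sum of a win x win neighbourhood at every pixel (edge-padded)."""
    pad = win // 2
    p = np.pad(img, pad, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for dy in range(win):
        for dx in range(win):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def decision_map(fused, src_a, src_b, win=3):
    """True where the coarse fused image is locally closer to source A
    (mean absolute difference over a win x win window) than to source B."""
    mad_a = _box_sum(np.abs(fused - src_a), win)
    mad_b = _box_sum(np.abs(fused - src_b), win)
    return mad_a <= mad_b

def smooth_map(mask, win=3):
    """Majority filter: a cheap stand-in for morphological open/close."""
    votes = _box_sum(mask.astype(float), win)
    return votes > (win * win) // 2

def fine_fuse(fused, src_a, src_b):
    """Pick each output pixel from the source the coarse result matches best."""
    m = smooth_map(decision_map(fused, src_a, src_b))
    return np.where(m, src_a, src_b)
```

A real implementation would keep the coarsely fused pixel wherever it is similar to both sources and fall back to a source image only where the fusion diverges; the sketch collapses that to a two-way choice for brevity.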
Recently, an improved motion compensation method based on the sum of absolute differences (SAD) has been applied to frame persistence in conventional ultrasonic imaging because of its high accuracy and relatively simple implementation. However, high time consumption remains a significant drawback of this space-domain method. To develop a faster motion compensation method and to determine whether conventional traversal correlation can be eliminated, motion-compensated speckle tracking between two temporally adjacent B-mode frames based on particle filtering is discussed. The optimal initial density of particles, the minimum number of iterations, and the optimal transition radius of the second iteration are derived from simulation results in order to evaluate the proposed method quantitatively. The speckle-tracking results obtained with the optimized parameters indicate that the proposed method can track the micromotion of speckle throughout a region of interest (ROI) superimposed on global motion. The computational cost of the proposed method is reduced by 25% compared with the previous algorithm, although further improvement is still needed.
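The contrast between the two search strategies can be sketched as follows: `sad_traverse` is the conventional exhaustive traversal the abstract describes, while `sad_particles` loosely illustrates the particle-filtering idea of replacing the full grid with randomly sampled candidate displacements, re-centred on the best match each iteration with a shrinking transition radius. Function names, parameters, and the resampling scheme are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def sad_at(ref_block, img, y, x):
    """SAD between ref_block and the same-sized patch of img at (y, x)."""
    bh, bw = ref_block.shape
    if y < 0 or x < 0 or y + bh > img.shape[0] or x + bw > img.shape[1]:
        return np.inf                       # out of bounds: worst possible score
    return np.abs(img[y:y + bh, x:x + bw] - ref_block).sum()

def sad_traverse(ref_block, img, top_left, radius):
    """Conventional exhaustive search over the (2r+1)^2 displacement grid."""
    y0, x0 = top_left
    cands = [(dy, dx) for dy in range(-radius, radius + 1)
                      for dx in range(-radius, radius + 1)]
    return min(cands, key=lambda d: sad_at(ref_block, img, y0 + d[0], x0 + d[1]))

def sad_particles(ref_block, img, top_left, radius, n=40, iters=2, seed=0):
    """Particle-style search: sample n random displacements, keep the best,
    and resample around it with a halved transition radius each iteration."""
    rng = np.random.default_rng(seed)
    y0, x0 = top_left
    best, r = (0, 0), radius
    for _ in range(iters):
        cands = {tuple(best)} | {tuple(best + rng.integers(-r, r + 1, 2))
                                 for _ in range(n)}
        best = np.array(min(cands, key=lambda d: sad_at(
            ref_block, img, y0 + d[0], x0 + d[1])))
        r = max(r // 2, 1)
    return tuple(best)
```

The traversal evaluates all (2r+1)² offsets; the particle variant evaluates at most n+1 per iteration, which is where the reported speed-up would come from, at the cost of a small chance of missing the global SAD minimum.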