Introduction
Given a few exemplars, few-shot object counting aims to count objects of the corresponding class in query images. However, when a query image contains many target objects or background interference, targets may occlude and overlap one another, which reduces counting accuracy.

Methods
To overcome this problem, we propose a novel Hough matching feature enhancement network. First, we extract image features with a fixed convolutional network and refine them through local self-attention. We also design an exemplar feature aggregation module to enhance the commonality of the exemplar features. Then, we build a Hough space to vote for candidate object regions; the Hough matching outputs reliable similarity maps between the exemplars and the query image. Finally, we augment the query feature with exemplar features according to the similarity maps, and we use a cascade structure to further enhance the query feature.

Results
Experiments on FSC-147 show that our network outperforms existing methods, improving the mean absolute counting error on the test set from 14.32 to 12.74.

Discussion
Ablation experiments demonstrate that Hough matching yields more accurate counting than previous matching methods.
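The similarity-map-guided enhancement described above can be illustrated with a minimal NumPy sketch. The array shapes, the cosine-similarity matching, and the additive fusion below are our illustrative assumptions, not the paper's exact formulation; the actual method obtains its similarity maps by voting in a Hough space rather than by plain cosine matching.

```python
import numpy as np

def cosine_similarity_map(query_feat, exemplar_feat):
    """Similarity between each query location and a pooled exemplar vector.

    query_feat: (H, W, C) feature map; exemplar_feat: (C,) vector.
    Returns an (H, W) similarity map with values in [-1, 1].
    """
    q = query_feat / (np.linalg.norm(query_feat, axis=-1, keepdims=True) + 1e-8)
    e = exemplar_feat / (np.linalg.norm(exemplar_feat) + 1e-8)
    return q @ e

def enhance_query_feature(query_feat, exemplar_feats):
    """Augment the query feature with exemplar features weighted by similarity."""
    enhanced = query_feat.copy()
    for e in exemplar_feats:
        sim = cosine_similarity_map(query_feat, e)  # (H, W) weight map
        enhanced += sim[..., None] * e              # broadcast the exemplar over locations
    return enhanced

rng = np.random.default_rng(0)
F = rng.normal(size=(8, 8, 16))                 # toy query feature map
exemplars = [rng.normal(size=16) for _ in range(3)]  # three pooled exemplar features
out = enhance_query_feature(F, exemplars)
print(out.shape)  # (8, 8, 16)
```

In the paper's cascade structure, such an enhancement step would be applied repeatedly, with each stage refining the query feature produced by the previous one.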
Visual correspondence refers to building dense correspondences between two or more images of the same category. Ideally, the keypoints predicted by a model can be mapped back to the source image's keypoints by a network of the same type. In practice, however, the predicted keypoints usually do not map perfectly back to the source keypoints. To strengthen the cycle-consistency of the model, we propose a cycle-consistent reciprocal network. The network uses joint loss functions to alternately train forward and inverse models, subjecting both models to cycle constraints so that each performs better with the help of the other. Experimental results show that the model improves performance on three popular benchmarks and sets a new state of the art on PF-WILLOW.
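The cycle constraint above can be sketched as a round-trip loss: map keypoints forward, map them back with the inverse model, and penalize the distance to the originals. The linear toy models and the mean-squared-error form below are illustrative assumptions; the abstract does not specify the models or the exact loss.

```python
import numpy as np

def cycle_loss(forward, inverse, keypoints):
    """Mean squared distance between keypoints and their round-trip mapping.

    keypoints: (N, 2) array of (x, y) coordinates in the source image.
    forward/inverse: callables mapping (N, 2) -> (N, 2).
    """
    round_trip = inverse(forward(keypoints))
    return float(np.mean(np.sum((round_trip - keypoints) ** 2, axis=-1)))

# Toy stand-ins for the two models: a linear warp and its exact inverse.
A = np.array([[1.1, 0.0],
              [0.0, 0.9]])
forward = lambda pts: pts @ A.T
inverse = lambda pts: pts @ np.linalg.inv(A).T

pts = np.array([[10.0, 20.0],
                [30.0,  5.0]])
loss = cycle_loss(forward, inverse, pts)
print(loss)  # ~0: a perfect inverse closes the cycle
```

During alternating training, one would hold the forward model fixed while minimizing this loss for the inverse model, and vice versa, so the two models improve with each other's help.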