Abstract: Edit propagation is an appearance-editing method using sparsely provided edit strokes from users. Although edit propagation has a wide variety of applications, it is computationally complex, owing to the need to solve large linear systems. To reduce the computational cost, interpolation-based approaches have been studied intensely. This study is inspired by an interpolation-based edit-propagation method that uses a clustering algorithm to determine samples. The method uses an interpolant, which approximates ed…
“…By reformulating edit propagation as a function interpolation problem in a high-dimensional feature space, Li et al. [6] efficiently solved the problem using radial basis functions. Another interpolation-based method, proposed by Yatagawa et al. [11], approximated the edit parameters with convex combinations of samples, which can achieve better accuracy in terms of colors and edit parameters. Another acceleration approach worth mentioning is the method based on hierarchical data structures [12], which achieved scalable edit propagation.…”
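The reformulation described above can be illustrated with a minimal sketch: edit parameters specified at sampled (stroked) pixels are interpolated over a per-pixel feature space with radial basis functions. The Gaussian kernel, the direct linear solve, and all names here are illustrative assumptions, not necessarily the choices made in [6].

```python
import numpy as np

def rbf_edit_interpolation(features, sample_features, sample_edits, sigma=1.0):
    """Interpolate per-pixel edit parameters from sparse samples using
    Gaussian radial basis functions over a pixel feature space.

    features:        (N, d) feature vectors for all pixels
    sample_features: (M, d) feature vectors at sampled (stroked) pixels
    sample_edits:    (M,)   edit parameters specified at the samples
    """
    # Solve for RBF weights w such that K @ w = sample_edits, where
    # K[i, j] = exp(-||s_i - s_j||^2 / (2 sigma^2)) for samples s_i, s_j.
    diff = sample_features[:, None, :] - sample_features[None, :, :]
    K = np.exp(-np.sum(diff**2, axis=-1) / (2 * sigma**2))
    w = np.linalg.solve(K, sample_edits)

    # Evaluate the interpolant at every pixel's feature vector.
    diff_all = features[:, None, :] - sample_features[None, :, :]
    Phi = np.exp(-np.sum(diff_all**2, axis=-1) / (2 * sigma**2))
    return Phi @ w
```

Because only an M×M system is solved (M = number of samples, typically far smaller than the pixel count N), this is how interpolation-based methods sidestep the large linear systems of the original formulation.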
“…AppProp [AP08] yields better propagation by optimizing color differences not only between nearby pixels, but also between non-neighboring ones. Computational efficiency has also been improved by using a kd-tree [XLJ*09], continuously approximating the feature space using radial basis functions (RBFs) [LJH10], manifold learning [MCY*13], efficient stroke sampling [BHW11], and sparse pixel sampling [YY15]. Most of the previous approaches share a common issue, namely that halo artifacts occur across object boundaries [LWA*12].…”
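The all-pairs optimization mentioned in this snippet can be sketched as a quadratic energy: a data term tying stroked pixels to their user-specified edits, plus a smoothness term over all pixel pairs weighted by feature affinity. This dense sketch is purely illustrative (the actual AppProp method uses a low-rank approximation of the affinity matrix for efficiency, and the names below are hypothetical).

```python
import numpy as np

def all_pairs_propagation(features, stroke_mask, stroke_vals, lam=1.0, sigma=1.0):
    """Propagate edits by minimizing an AppProp-style quadratic energy:
    sum_i m_i (e_i - g_i)^2  +  lam * sum_{i,j} Z_ij (e_i - e_j)^2,
    where Z_ij is the affinity between pixels i and j (all pairs,
    not just neighbors), m_i marks stroked pixels, and g_i their edits.
    """
    diff = features[:, None, :] - features[None, :, :]
    Z = np.exp(-np.sum(diff**2, axis=-1) / (2 * sigma**2))  # all-pairs affinity
    W = np.diag(stroke_mask.astype(float))                  # data-term weights
    L = np.diag(Z.sum(axis=1)) - Z                          # graph Laplacian
    # Setting the gradient to zero gives the linear system (W + lam L) e = W g.
    return np.linalg.solve(W + lam * L, W @ stroke_vals)
```

Because every pixel pair contributes to the smoothness term, edits reach similar but spatially distant regions, which is exactly the improvement over neighbor-only propagation that the snippet describes.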
Section: Related Work (mentioning, confidence: 99%)
“…Many efforts have been made to attack the edit propagation problem [LLW04, LWCO*07, LAA08, XLJ*09, XWT*09, LJH10, CZZT12, XYJ13, CZL*14, YY15]. Various applications of edit propagation exist, such as grayscale image colorization, color image recoloring, segmentation, and tone adjustment.…”
Section: Introduction (mentioning, confidence: 99%)
“…Various applications of edit propagation exist, such as grayscale image colorization, color image recoloring, segmentation, and tone adjustment. Many efforts have been made to attack the edit propagation problem [LLW04, LWCO*07, LAA08, XLJ*09, XWT*09, LJH10, CZZT12, XYJ13, CZL*14, YY15]. By specifying sparse image edits, users can propagate them to the entire image according to a propagation principle based on pixel similarity (e.g., proximity of positions, colors, or textures).…”
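The pixel-similarity principle mentioned above is commonly realized as a pairwise affinity that combines spatial and color proximity. A minimal sketch, with hypothetical names and bandwidths chosen for illustration only:

```python
import numpy as np

def pixel_affinity(p_i, p_j, c_i, c_j, sigma_p=10.0, sigma_c=0.2):
    """Similarity between two pixels combining spatial proximity (positions
    p_i, p_j) and color proximity (colors c_i, c_j). Edits propagate
    strongly between pixel pairs with high affinity."""
    d_pos = np.sum((np.asarray(p_i, float) - np.asarray(p_j, float))**2)
    d_col = np.sum((np.asarray(c_i, float) - np.asarray(c_j, float))**2)
    return np.exp(-d_pos / (2 * sigma_p**2) - d_col / (2 * sigma_c**2))
```

The bandwidths sigma_p and sigma_c control the relative importance of position versus color; tuning such weights manually is exactly the burden that the representation-learning approach in the abstract below aims to remove.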
Edit propagation is a technique that can propagate various image edits (e.g., colorization and recoloring) performed via user strokes to the entire image based on the similarity of image features. In most previous work, users must manually determine the importance of each image feature (e.g., color, coordinates, and textures) in accordance with their needs and target images. We focus on representation learning that automatically learns feature representations only from user strokes in a single image instead of tuning existing features manually. To this end, this paper proposes an edit propagation method using a deep neural network (DNN). Our DNN, which consists of several layers such as convolutional layers and a feature combiner, extracts stroke-adapted visual features and spatial features, and then adjusts their importance. We also develop a learning algorithm for our DNN that does not suffer from the vanishing gradient problem, and hence avoids falling into undesirable locally optimal solutions. We demonstrate that edit propagation with deep features, without manual feature tuning, can achieve better results than previous work.
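The "feature combiner" that the abstract describes can be pictured, in a heavily simplified form, as a layer that rescales each channel of the concatenated visual and spatial features by a learned importance weight. This is a speculative sketch of the general idea, not the paper's actual architecture; all names and the sigmoid gating are assumptions.

```python
import numpy as np

def combine_features(visual_feats, spatial_feats, alpha):
    """Hypothetical 'feature combiner': concatenate per-pixel visual and
    spatial features, then rescale each channel by a learned importance
    weight (alpha is squashed by a sigmoid so weights lie in (0, 1)).

    visual_feats:  (N, dv) visual features (e.g., from conv layers)
    spatial_feats: (N, ds) spatial features (e.g., pixel coordinates)
    alpha:         (dv + ds,) learnable per-channel importance logits
    """
    stacked = np.concatenate([visual_feats, spatial_feats], axis=-1)
    importance = 1.0 / (1.0 + np.exp(-alpha))  # per-channel importance
    return stacked * importance
```

Learning alpha from the user's strokes would replace the manual per-feature weighting required by earlier methods.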
“…The paper by Debattista et al. [32] is about the compression of high-dynamic-range images. Yatagawa and Yamaguchi [33] present a method for the appearance editing of images. The paper by Hua and Wang [34] presents an image completion method, and the last paper related to image processing is by Qiao et al. [35]; they describe a technique to generate QR codes that are visually similar to an input image.…”