Fig. 1. (a) An overview of our method: given a few reference samples (5 for English or 30 for Chinese), glyph images of all other characters in the same style can be synthesized. (b) Examples of synthesized English/Chinese glyph images obtained by our proposed AGIS-Net, MC-GAN [Azadi et al. 2018] and TET-GAN [Yang et al. 2019], respectively; please zoom in for closer inspection.

Automatic generation of artistic glyph images is a challenging task that has attracted considerable research interest. Previous methods are either specifically designed for shape synthesis or focused on texture transfer. In this paper, we propose a novel model, AGIS-Net, that transfers both shape and texture styles in a single stage with only a few stylized samples. To achieve this goal, we first disentangle the representations of content and style by using two encoders, enabling multi-content and multi-style generation. We then utilize two collaboratively working decoders to generate the glyph shape image and its texture image simultaneously. In addition, we introduce a local texture refinement loss to further improve the quality of the synthesized textures. In this manner, our one-stage model is much more efficient and effective than other multi-stage stacked methods. We also propose a large-scale dataset of Chinese glyph images in various shape and texture styles, rendered from 35 professionally designed artistic fonts with 7,326 characters and 2,460 synthetic artistic fonts with 639 characters, to validate the effectiveness and extensibility of our method. Extensive experiments on both English and Chinese artistic glyph image datasets demonstrate the superiority of our model over other state-of-the-art methods in generating high-quality stylized glyph images.
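The two-encoder/two-decoder layout described in the abstract can be sketched as follows. This is a toy illustration under stated assumptions, not the paper's actual architecture: the layer sizes, the use of plain linear maps, and the exact way the texture decoder consumes the shape decoder's output are all illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(n_in, n_out):
    """Stand-in for a learned layer: a fixed random projection (assumption)."""
    return rng.normal(scale=0.1, size=(n_in, n_out))

E_content = linear(64, 16)   # content encoder (fed a reference glyph)
E_style   = linear(64, 16)   # style encoder (fed the few stylized samples)
D_shape   = linear(32, 64)   # shape decoder
D_texture = linear(96, 64)   # texture decoder, also fed the shape output

def synthesize(content_img, style_img):
    # Disentangled codes from the two encoders, then a shared latent.
    z = np.concatenate([content_img @ E_content, style_img @ E_style])
    shape = z @ D_shape                                # glyph shape image
    texture = np.concatenate([z, shape]) @ D_texture   # collaborating decoder
    return shape, texture

shape, texture = synthesize(rng.normal(size=64), rng.normal(size=64))
```

The key structural point the sketch captures is that both decoders share the disentangled content/style code, while the texture decoder additionally sees the shape decoder's output, so shape and texture are generated jointly in one stage.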
Interleaving experiments are an attractive methodology for evaluating retrieval functions through implicit feedback. The experiment is designed as a blind and unbiased test for eliciting a preference between two retrieval functions: an interleaved ranking of the results of the two functions is presented to users, and it is then observed whether users click more on results from one retrieval function or the other. While it was shown that such interleaving experiments reliably identify the better of the two retrieval functions, the naive approach of counting all clicks equally leads to a suboptimal test. We present new methods for learning how to score different types of clicks so that the resulting test statistic optimizes the statistical power of the experiment. This can lead to substantial savings in the amount of data required for reaching a target confidence level. Our methods are evaluated on an operational search engine over a collection of scientific articles.
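The basic mechanism can be sketched as follows: interleave two rankings (team-draft style, here with a deterministic alternation instead of the usual coin flip) and credit clicks to the contributing function with per-click-type weights rather than counting every click as 1. The weight values and click-type names below are illustrative assumptions; the abstract's point is that such weights are learned to maximize the test's statistical power.

```python
def team_draft_interleave(ranking_a, ranking_b):
    """Simplified team-draft merge; remembers which 'team' supplied each doc."""
    merged, teams, used = [], [], set()
    ia = ib = 0
    turn_a = True
    while ia < len(ranking_a) or ib < len(ranking_b):
        if (turn_a and ia < len(ranking_a)) or ib >= len(ranking_b):
            doc, ia, team = ranking_a[ia], ia + 1, "A"
        else:
            doc, ib, team = ranking_b[ib], ib + 1, "B"
        turn_a = not turn_a
        if doc not in used:          # skip documents both rankings share
            used.add(doc)
            merged.append(doc)
            teams.append(team)
    return merged, teams

def weighted_preference(merged, teams, clicks, weights):
    """clicks: list of (doc, click_type). Positive result favours A."""
    team_of = dict(zip(merged, teams))
    score = {"A": 0.0, "B": 0.0}
    for doc, kind in clicks:
        score[team_of[doc]] += weights[kind]
    return score["A"] - score["B"]

merged, teams = team_draft_interleave([1, 2, 3], [3, 4, 5])
# Illustrative weights: value the session's last click more than earlier ones.
weights = {"click": 1.0, "last_click": 2.0}
delta = weighted_preference(merged, teams,
                            [(1, "click"), (2, "last_click")], weights)
```

Averaging `delta` over many user sessions gives the test statistic; choosing the weights to separate the two functions with the fewest sessions is exactly the learning problem the abstract describes.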
We offer and test a simple operationalization of hedonic and eudaimonic well-being (“happiness”) as mediating variables that link outcomes to motivation. In six evolutionary agent-based simulation experiments, we compared the relative performance of agents endowed with different combinations of happiness-related traits (parameter values), under four types of environmental conditions. We found (i) that the effects of attaching more weight to longer-term than to momentary happiness and of extending the memory for past happiness are both stronger in an environment where food is scarce; (ii) that in such a scarce-food environment “relative consumption,” in which the agent’s well-being is negatively affected by that of its neighbors, is more detrimental to survival; and (iii) that having a positive outlook, under which agents’ longer-term happiness is increased by positive events more than it is decreased by negative ones, is generally advantageous.
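One agent of the kind described above can be sketched as follows. The trait names and the update rule are illustrative assumptions: `outlook_gain` scales positive events more than negative ones (the “positive outlook” of finding iii), `memory_len` is the window for past happiness, and `long_term_weight` trades longer-term against momentary happiness (findings i).

```python
from collections import deque

class Agent:
    def __init__(self, long_term_weight=0.7, memory_len=10, outlook_gain=1.0):
        self.w = long_term_weight              # weight on longer-term happiness
        self.memory = deque(maxlen=memory_len) # memory for past happiness
        self.gain = outlook_gain               # >1 models a "positive outlook"

    def experience(self, event):
        """event > 0 is a positive outcome (e.g. finding food)."""
        momentary = self.gain * event if event > 0 else event
        self.memory.append(momentary)
        long_term = sum(self.memory) / len(self.memory)
        # The well-being signal that would drive the agent's motivation.
        return self.w * long_term + (1 - self.w) * momentary

events = [1.0, -1.0, 1.0, -1.0]          # a mixed run of outcomes
optimist, neutral = Agent(outlook_gain=1.5), Agent(outlook_gain=1.0)
for e in events:
    wb_opt = optimist.experience(e)
    wb_neu = neutral.experience(e)
```

After the same mixed sequence of outcomes, the positive-outlook agent's well-being signal stays higher than the neutral agent's, which is the mechanism behind the advantage reported in finding (iii).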