When two stationary visual objects appear in alternating sequence, they evoke the perception of a single object moving back and forth between them. This is known as stroboscopic or apparent motion and forms the basis of perceived continuity in, for example, motion pictures. When the spatiotemporal separation between the inducing objects is optimal, the subjective appearance of apparent motion is nearly indistinguishable from that of real motion. Here we report that the detection and identification of a simple visual form in the path of apparent motion is impaired by the illusory perception of an object moving through the empty space between the locations at which the inducing objects are presented. This observation may be a manifestation of perceptual completion or 'filling in' during apparent motion perception. We propose that feedback from higher to lower visual cortical areas activates an explicit neural representation of a moving object, which can then disrupt the representation of visual stimuli in the path of the movement.
Most previous studies of the sorting algorithm QuickSort have used the number of key comparisons as a measure of the cost of executing the algorithm. Here we suppose that the n independent and identically distributed (iid) keys are each represented as a sequence of symbols from a probabilistic source and that QuickSort operates on individual symbols, and we measure the execution cost as the number of symbol comparisons. Assuming only a mild "tameness" condition on the source, we show that there is a limiting distribution for the number of symbol comparisons after normalization: first centering by the mean and then dividing by n. Additionally, under a condition that grows more restrictive as p increases, we have convergence of moments of orders p and smaller. In particular, we have convergence in distribution and convergence of moments of every order whenever the source is memoryless, i.e., whenever each key is generated as an infinite string of iid symbols. This is somewhat surprising: Even for the classical model that each key is an iid string of unbiased ("fair") bits, the mean exhibits periodic fluctuations of order n.
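To make the cost model concrete, here is a minimal sketch (not the paper's analysis) of QuickSort whose cost is measured in symbol comparisons rather than key comparisons, with keys drawn from a memoryless source of fair bits. All names and parameters here are illustrative assumptions:

```python
import random

def symbol_compare(a, b, counter):
    """Lexicographically compare two keys given as symbol sequences,
    incrementing counter[0] once per individual symbol comparison."""
    for x, y in zip(a, b):
        counter[0] += 1
        if x != y:
            return -1 if x < y else 1
    # Finite strings stand in for the model's infinite iid strings,
    # so ties are broken by length here.
    return (len(a) > len(b)) - (len(a) < len(b))

def quicksort(keys, counter):
    """QuickSort (three-way partition) driven by symbol-level comparisons."""
    if len(keys) <= 1:
        return list(keys)
    pivot = keys[0]
    less, equal, greater = [], [], []
    for k in keys:
        c = symbol_compare(k, pivot, counter)
        (less if c < 0 else equal if c == 0 else greater).append(k)
    return quicksort(less, counter) + equal + quicksort(greater, counter)

# Memoryless source: each key is a string of iid unbiased ("fair") bits.
random.seed(1)
n = 200
keys = ["".join(random.choice("01") for _ in range(40)) for _ in range(n)]
counter = [0]
result = quicksort(keys, counter)
```

Reading off `counter[0]` instead of the number of `symbol_compare` calls is exactly the shift from counting key comparisons to counting symbol comparisons described above.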
In this study, we take a first step towards a theoretical analysis of genetic algorithms (GAs) in noisy environments using Markov chain theory. We explicitly construct a Markov chain that models GAs applied to fitness functions perturbed by either additive or multiplicative noise taking finitely many values, and we analyze the chain to investigate the transition and convergence properties of the GAs. For additive noise, our analysis shows that GAs eventually (i.e., as the number of iterations goes to infinity) find at least one globally optimal solution with probability 1. For multiplicative noise, by contrast, GAs may fail to do so with probability 1, and we establish a necessary and sufficient condition for a globally optimal solution eventually to be found. In addition, our analysis shows that the chain has a stationary distribution that is also its steady-state distribution. Based on this property, we derive an upper bound on the number of iterations sufficient to ensure, with a prescribed probability, that a GA has reached the set of globally optimal solutions and that each subsequent population contains at least one globally optimal solution whose observed fitness value exceeds that of every suboptimal solution.
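As an illustration of the setting (not the paper's Markov-chain construction), the following sketch runs a GA on a OneMax-style fitness perturbed by additive noise taking finitely many values; the sequence of populations it produces is a Markov chain on the space of populations. All function names and parameter values are illustrative assumptions:

```python
import random

NOISE = (-1, 0, 1)  # additive noise taking finitely many values

def noisy_fitness(bits):
    """Observed fitness: true OneMax fitness plus additive noise."""
    return sum(bits) + random.choice(NOISE)

def next_generation(population, mutation_rate=0.02):
    """One GA generation: binary tournament selection on noisy
    fitness, then independent bitwise mutation. The next population
    depends only on the current one, so the population sequence
    is a Markov chain."""
    children = []
    for _ in range(len(population)):
        a, b = random.sample(population, 2)
        winner = a if noisy_fitness(a) >= noisy_fitness(b) else b
        children.append([bit ^ (random.random() < mutation_rate)
                         for bit in winner])
    return children

random.seed(0)
length, pop_size = 10, 30
pop = [[random.randint(0, 1) for _ in range(length)]
       for _ in range(pop_size)]
generations = 0
# Iterate until a globally optimal solution (all ones) appears,
# with a cap so the sketch always terminates.
while generations < 5000 and not any(sum(ind) == length for ind in pop):
    pop = next_generation(pop)
    generations += 1
```

Because the noise here is additive and bounded, sufficiently large true-fitness differences survive the perturbation; this is one intuition for why the additive case admits the positive convergence result, while multiplicative noise can distort fitness rankings more severely.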