Despite general agreement that prediction is a central aspect of perception, there is relatively little evidence concerning the basis on which visual predictions are made. Although both saccadic and pursuit eye movements reveal knowledge of the future position of a moving visual target, in many of these studies targets move along simple trajectories in a frontoparallel plane. Here, using a naturalistic racquet-based interception task in a virtual environment, we demonstrate that subjects make accurate predictions of visual target motion even when targets follow trajectories determined by the complex dynamics of physical interactions and the head and body are unrestrained. Furthermore, we found that, following a change in ball elasticity, subjects accurately adjusted their prebounce predictions of the ball's post-bounce trajectory. This suggests that prediction is guided by experience-based models of how information in the visual image will change over time.
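One way to make the "experience-based model" above concrete is an idealized bounce model in which a coefficient of restitution stands in for ball elasticity: lowering it attenuates the predicted post-bounce vertical velocity, which is the kind of adjustment subjects made. The sketch below is a minimal illustration under these assumptions; the point-mass model, parameter names, and values are ours, not the authors' implementation.

```python
import numpy as np

def predict_post_bounce(pos, vel, e=0.8, g=9.81):
    """Idealized point-mass bounce: ballistic flight until floor contact,
    then the vertical velocity is reversed and scaled by the coefficient
    of restitution e (a stand-in for ball elasticity).
    pos and vel are (x, y, z) with z up; the floor is at z = 0.
    All values here are illustrative assumptions."""
    x, y, z = pos
    vx, vy, vz = vel
    # Time to reach the floor under constant gravity:
    # z + vz*t - 0.5*g*t**2 = 0, taking the positive root.
    t = (vz + np.sqrt(vz**2 + 2.0 * g * z)) / g
    impact = np.array([x + vx * t, y + vy * t, 0.0])
    # Elasticity reverses and attenuates only the vertical component.
    v_post = np.array([vx, vy, -(vz - g * t) * e])
    return impact, v_post

# A less lively ball (smaller e) yields a flatter predicted post-bounce path.
impact, v_post = predict_post_bounce(pos=(0.0, 0.0, 1.5), vel=(3.0, 0.0, -1.0))
```

Under this reading, updating the prediction after a change in ball elasticity amounts to re-estimating a single parameter (e) from recent bounces.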
Visual quantification of parasitemia in thin blood films is a tedious, subjective, and time-consuming task. This study presents an original method for quantifying and classifying erythrocytes in stained thin blood films infected with Plasmodium falciparum. The proposed approach comprises three main phases: a preprocessing step that corrects luminance differences; a segmentation step that uses the normalized RGB color space to classify pixels as either erythrocyte or background, followed by an Inclusion-Tree representation that structures the pixel information into objects from which erythrocytes are found; and a two-step classification process that identifies infected erythrocytes and differentiates the infection stage using a trained bank of classifiers. Additionally, user intervention is allowed when the approach cannot make a proper decision. Four hundred fifty malaria images were used for training and evaluating the method. Automatic identification of infected erythrocytes showed a specificity of 99.7% and a sensitivity of 94%. The infection stage was determined with an average sensitivity of 78.8% and an average specificity of 91.2%.
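As a rough illustration of the segmentation phase, the sketch below classifies pixels in the normalized RGB (chromaticity) space, which discounts the luminance differences handled in preprocessing. The thresholds and the simple decision rule are hypothetical placeholders; the paper's method uses a trained bank of classifiers and an Inclusion-Tree representation rather than fixed thresholds.

```python
import numpy as np

def normalized_rgb(img):
    """Map an HxWx3 uint8 image to normalized RGB (chromaticity) space,
    r = R/(R+G+B) and likewise for g and b, which factors out luminance."""
    img = img.astype(np.float64)
    s = img.sum(axis=2, keepdims=True)
    s[s == 0] = 1.0  # avoid division by zero on pure-black pixels
    return img / s

def segment_erythrocytes(img, r_min=0.38, g_max=0.33):
    """Label each pixel as erythrocyte (True) or background (False).
    The thresholds r_min/g_max are hypothetical: stained erythrocytes
    tend to be redder than the bright background, but real decision
    boundaries would come from training data, as in the paper."""
    nrgb = normalized_rgb(img)
    r, g = nrgb[..., 0], nrgb[..., 1]
    return (r > r_min) & (g < g_max)
```

The resulting binary mask would then be grouped into connected objects, the structuring role the Inclusion-Tree representation plays in the paper, before per-cell classification.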
In addition to stimulus properties and task factors, memory is an important determinant of the allocation of attention and gaze in the natural world. One way that the role of memory is revealed is by predictive eye movements. Both smooth pursuit and saccadic eye movements demonstrate predictive effects based on previous experience. We have previously shown that unskilled subjects make highly accurate predictive saccades to the anticipated location of a ball prior to a bounce in a virtual racquetball setting. In this experiment, we examined this predictive behaviour. We asked whether the period after the bounce provides subjects with visual information about the ball trajectory that is used to programme the pursuit movement initiated when the ball passes through the fixation point. We occluded a 100 ms period of the ball's trajectory immediately after the bounce, and found very little effect on the subsequent pursuit movement. Subjects did not appear to modify their strategy to prolong the fixation. Neither were we able to find an effect on interception performance. Thus, it is possible that the occluded trajectory information is not critical for subsequent pursuit, and subjects may use an estimate of the ball's trajectory to programme pursuit. These results provide further support for the role of memory in eye movements.
People can often anticipate the outcome of another person's actions based on visual information available in the movements of the other person's body. We investigated this problem by studying how goalkeepers anticipate the direction of a penalty kick in soccer. The specific aim was to determine whether the information used to anticipate kick direction is best characterized as local to a particular body segment or distributed across multiple segments. In Experiment 1, we recorded the movements of soccer players as they kicked balls into a net. Using a novel method for analyzing motion capture data, we identified sources of local and distributed information that were reliable indicators of kick direction. In Experiments 2 and 3, subjects were presented with animations of kickers' movements prior to foot-to-ball contact and instructed to judge kick direction. Judgments were consistent with the use of distributed information, with a possible small contribution of local information.
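One generic way to operationalize the local-versus-distributed distinction (an illustration only, not the authors' analysis method) is to compare how well kick direction can be decoded from a single body segment's kinematic features versus features pooled across all segments. Everything in the sketch, including the synthetic data, is a placeholder.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical data: per-kick kinematic features for each body segment
# and a binary kick-direction label (0 = left, 1 = right).
rng = np.random.default_rng(0)
n_kicks, n_segments, n_feats = 200, 12, 6
X = rng.normal(size=(n_kicks, n_segments, n_feats))
y = rng.integers(0, 2, size=n_kicks)

def direction_decodability(features, labels):
    """Cross-validated accuracy of a linear decoder of kick direction."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, features, labels, cv=5).mean()

# Local information: decode from each segment alone.
local = [direction_decodability(X[:, s, :], y) for s in range(n_segments)]

# Distributed information: decode from all segments jointly.
distributed = direction_decodability(X.reshape(n_kicks, -1), y)
```

If the pooled decoder reliably beats the best single-segment decoder, the predictive information is distributed across segments, the pattern the observers' judgments in Experiments 2 and 3 were consistent with.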
The interaction between the vestibular and ocular systems has primarily been studied in controlled environments. Consequently, off-the-shelf tools for categorizing gaze events (e.g., fixations, pursuits, saccades) fail when head movements are allowed. Our approach was to collect a novel, naturalistic, and multimodal dataset of eye+head movements as subjects performed everyday tasks while wearing a mobile eye tracker equipped with an inertial measurement unit and a 3D stereo camera. This Gaze-in-the-Wild dataset (GW) includes eye+head rotational velocities (deg/s), infrared eye images, and scene imagery (RGB+D). A portion was labelled by coders into gaze motion events with a mutual agreement of 0.72 (sample-based Cohen's κ). This labelled data was used to train and evaluate two machine learning algorithms, a Random Forest and a Recurrent Neural Network model, for gaze event classification. Assessment involved the application of established and novel event-based performance metrics. The classifiers achieve ∼90% of human performance in detecting fixations and saccades but fall short (60%) in detecting pursuit movements. Moreover, pursuit classification is far worse in the absence of head movement information. A subsequent analysis of feature significance in our best-performing model revealed a reliance upon absolute eye and head velocity, indicating that classification does not require spatial alignment of the head and eye tracking coordinate systems. The GW dataset, trained classifiers, and evaluation metrics will be made publicly available with the intention of facilitating growth in the emerging area of head-free gaze event classification.

This work builds upon a variety of techniques previously used to track head orientation during natural behavior. Published studies have demonstrated the use of rotational potentiometers and accelerometers [8], magnetic coils [12], or motion capture [13] for sensing head orientation [6]. Perhaps the highest-precision eye+head tracker that allowed body movement leveraged a 5.8 m³ custom-made armature capable of generating a pulsing magnetic field; the subject was outfitted with a head-worn receiver capable of measuring head position and orientation within its operational region [14]. Several systems have adopted video-based head motion compensation [10, 15] and demonstrated promising results, but these are too computationally expensive for real-time use and are prone to irrecoverable track loss, especially during periods of rapid head movement, occlusion of tracking features, or degradation of image quality due to motion blur. Recent approaches have involved the use of head-mounted IMUs. For example, Larsson et al. used a head-mounted IMU in a study where subjects were asked to perform visual tracking tasks while watching pre-rendered stimuli projected onto a 2D screen [16]. They established that compensating for head movements results in a reduced sta...
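A minimal sketch of a Random Forest gaze-event classifier of the kind evaluated above, assuming windowed absolute eye and head rotational velocities (deg/s) as per-sample features. The window length, feature layout, and hyperparameters are assumptions, not the released GW pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

LABELS = {0: "fixation", 1: "saccade", 2: "pursuit"}

def windowed_features(eye_vel, head_vel, half_width=5):
    """Stack absolute eye and head rotational velocities (deg/s) in a
    sliding window around each sample. Using absolute velocities means
    no spatial alignment of the eye and head coordinate systems is
    needed, consistent with the feature-significance result above."""
    n = len(eye_vel)
    ev = np.pad(np.abs(eye_vel), half_width, mode="edge")
    hv = np.pad(np.abs(head_vel), half_width, mode="edge")
    win = 2 * half_width + 1
    feats = [np.concatenate([ev[i:i + win], hv[i:i + win]]) for i in range(n)]
    return np.asarray(feats)

# Hypothetical stand-in signals; the real GW data supplies labelled
# eye+head rotational velocities from the IMU-equipped tracker.
rng = np.random.default_rng(1)
eye_vel = rng.gamma(2.0, 20.0, size=5000)
head_vel = rng.gamma(2.0, 5.0, size=5000)
labels = rng.integers(0, 3, size=5000)

X = windowed_features(eye_vel, head_vel)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
pred = clf.predict(X)
```

Per-sample predictions like these would then be scored with the event-based metrics mentioned above, since event-level agreement, not sample accuracy, is what the paper reports.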