Crowding impedes the identification of flanked objects in peripheral vision. Prior studies have shown that crowding strength decreases with decreasing target-flanker similarity. Research on crowding in Chinese-character recognition has been scarce. We aimed to fill this gap by examining the effects of structural similarity on Chinese-character crowding. In Experiment 1, we manipulated the regularity of the flankers' within-character configuration, i.e., their orthographic legality. Target-flanker similarity in orthographic legality did not affect crowding strength, measured as contrast threshold elevation; crowding weakened only when the strokes in the flankers were scrambled. In Experiments 2a and 2b, we manipulated the contour integrity of the flankers by randomly perturbing the phase spectra of the stimulus images. Crowding by perturbed-phase flankers remained robust but was weaker than crowding by intact-phase flankers, indicating that target-flanker similarity in contour integrity modulated crowding strength. Our findings are consistent with the postulation that faulty integration of low-level visual features contributes to crowding of Chinese characters. Studies on Chinese-character recognition and crowding can provide important insights into how the visual system processes complex everyday objects.
Absolute pitch (AP) refers to the ability to name a musical tone without an external reference. The influential two-component model holds that AP is limited only by the late-emerging pitch-labeling process, not by earlier perceptual and memory processes. Over the years, however, neural-level support for this model has been mixed and subject to various methodological limitations. Here, the electroencephalography responses of 27 AP possessors and 27 non-AP possessors were recorded. During both name verification and passive listening, event-related potential analyses showed a difference between AP and non-AP possessors at about 200 ms in their responses to tones compared with noise stimuli. Multivariate pattern analyses suggested that pitch naming was subserved by a series of transient processes over the first 250 ms, followed by a stage-like process in both AP and non-AP possessors, with no group differences between them. These findings are inconsistent with the predictions of the two-component model and instead suggest an early perceptual locus of AP.
At which phase(s) does task demand affect object processing? Previous studies have shown that task demand affects object representations in higher-level visual areas but has little effect in earlier areas. Those studies, however, have limitations: the relatively weak task manipulations afforded by familiar real-life objects, and/or the low temporal resolution of brain activation measures such as fMRI. In the current study, observers categorized images of artificial objects along one of two orthogonal dimensions, shape and texture. Electroencephalography (EEG), a technique with high temporal resolution, and multivariate pattern analysis (MVPA) were employed to track object processing over time under different task demands. Results showed that object processing along the task-relevant dimension was enhanced starting at a relatively late time (~230 ms after image onset), within the time range of the event-related potential (ERP) components N170 and N250. The findings are consistent with the view that task exerts its effect on object processing at the later phases of processing in the ventral visual pathway.