Objects in the natural world possess different visual attributes, including shape, colour, surface texture and motion. Previous perceptual studies have assumed that the brain analyses the colour of a surface independently of its three-dimensional shape and viewing geometry, although there are neural connections between colour and two-dimensional form processing early in the visual pathway. Here we show that colour perception is strongly influenced by three-dimensional shape perception in a novel, chromatic version of the Mach Card: a concave folded card with one side made of magenta paper and the other of white paper. The light reflected from the magenta paper casts a pinkish glow on the white side. The perceived colour of the white side changes from pale pink to deep magenta when the perceived shape of the card flips from concave to convex. The effect demonstrates that the human visual system incorporates knowledge of mutual illumination (the physics of light reflection between surfaces) at an early stage in colour perception.
The phenomenon of colour constancy in human visual perception keeps surface colours constant, despite changes in their reflected light due to changing illumination. Although colour constancy has evolved under a constrained subset of illuminations, it is unknown whether its underlying mechanisms, thought to involve multiple components from retina to cortex, are optimised for particular environmental variations. Here we demonstrate a new method for investigating colour constancy using illumination matching in real scenes which, unlike previous methods using surface matching and simulated scenes, allows testing of multiple, real illuminations. We use real scenes consisting of solid familiar or unfamiliar objects against uniform or variegated backgrounds and compare discrimination performance for typical illuminations from the daylight chromaticity locus (approximately blue-yellow) and atypical spectra from an orthogonal locus (approximately red-green, at correlated colour temperature 6700 K), all produced in real time by a 10-channel LED illuminator. We find that discrimination of illumination changes is poorer along the daylight locus than the atypical locus, and is poorest particularly for bluer illumination changes, demonstrating conversely that surface colour constancy is best for blue daylight illuminations. Illumination discrimination is also enhanced, and therefore colour constancy diminished, for uniform backgrounds, irrespective of the object type. These results are not explained by statistical properties of the scene signal changes at the retinal level. We conclude that high-level mechanisms of colour constancy are biased for the blue daylight illuminations and variegated backgrounds to which the human visual system has typically been exposed.
The computational problem underlying color vision is to recover the invariant surface-spectral-reflectance properties of an object. Lightness algorithms, which recover an approximation to surface reflectance in independent wavelength channels, have been proposed as one method to compute color. This paper clarifies and formalizes the lightness problem by proposing a new formulation of the intensity equation on which lightness algorithms are based and by identifying and discussing two basic subproblems of lightness and color computation: spatial decomposition and spectral normalization of the intensity signal. Several lightness algorithms are reviewed, and a new extension (the multiple-scales algorithm) of one of them is proposed. The main computational result is that each of the lightness algorithms may be derived from a single mathematical formula, under different conditions, which, in turn, imply limitations for the implementation of lightness algorithms by man or machine. In particular, the algorithms share certain limitations on their implementation that follow from the physical constraints imposed on the statement of the problem and the boundary conditions applied in its solution.
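The spatial-decomposition subproblem described above can be illustrated with a toy one-dimensional, Retinex-style computation: threshold the gradient of the log intensity signal to discard slow illumination changes, then reintegrate to recover reflectance up to an unknown scale factor. This is a minimal sketch of the general class of lightness algorithms, not the paper's multiple-scales algorithm; the threshold value and test signal are our own assumptions.

```python
import numpy as np

def lightness_1d(signal, thresh=0.1):
    """Toy 1-D lightness estimate: suppress small (illumination-like)
    log-gradients, keep large (reflectance-edge) ones, and reintegrate."""
    log_signal = np.log(signal)
    d = np.diff(log_signal)
    d[np.abs(d) < thresh] = 0.0          # discard slow illumination gradients
    est = np.concatenate([[0.0], np.cumsum(d)])
    return np.exp(est)                    # reflectance up to a scale factor

# Synthetic scene: a reflectance step under a smooth illumination gradient.
reflectance = np.array([0.2] * 10 + [0.8] * 10)
illumination = np.linspace(0.5, 1.0, 20)
est = lightness_1d(reflectance * illumination)

# The recovered ratio across the edge approximates the true 0.8 / 0.2 = 4,
# even though the raw intensity ratio is distorted by the illumination ramp.
print(est[15] / est[5])
```

Note that the unresolved scale factor is exactly the spectral-normalization subproblem the paper identifies: thresholded reintegration alone cannot fix the absolute level in each wavelength channel.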
Cameras record three color responses (RGB) which are device dependent. Camera coordinates are mapped to a standard color space, such as XYZ (useful for color measurement), by a mapping function, e.g., a simple 3×3 linear transform, usually derived through regression. This mapping, which we will refer to as linear color correction (LCC), has been demonstrated to work well in a number of studies. However, it can map RGBs to XYZs with high error. The advantage of LCC is that it is independent of camera exposure. An alternative and potentially more powerful method for color correction is polynomial color correction (PCC). Here, the R, G, and B values at a pixel are extended by polynomial terms. For a given calibration training set, PCC can significantly reduce the colorimetric error. However, the PCC fit depends on exposure: as exposure changes, the vector of polynomial components is altered in a nonlinear way, which results in hue and saturation shifts. This paper proposes a new polynomial-type regression, loosely related to the idea of fractional polynomials, which we call root-PCC (RPCC). Our idea is to take each k-degree term in a polynomial expansion and replace it with its kth root. It is easy to show that terms defined in this way scale linearly with exposure. RPCC is a simple (low complexity) extension of LCC. The experiments presented in this paper demonstrate that RPCC enhances color correction performance on real and synthetic data.
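The exposure-invariance property that distinguishes RPCC from PCC can be checked directly. The sketch below builds the degree-2 expansions (function names and the test pixel are our own, for illustration): taking the square root of each degree-2 cross term makes every feature scale linearly when the RGB triplet is multiplied by an exposure factor k, whereas the ordinary polynomial terms mix k and k² and so do not.

```python
import numpy as np

def pcc_terms(rgb):
    """Degree-2 polynomial expansion used in PCC (not exposure invariant)."""
    r, g, b = rgb
    return np.array([r, g, b, r * r, g * g, b * b, r * g, r * b, g * b])

def rpcc_terms(rgb):
    """Root-polynomial expansion: the kth root of each degree-k term.
    Rooted squared terms reduce to the linear ones, so only the
    cross-term roots are added."""
    r, g, b = rgb
    return np.array([r, g, b, np.sqrt(r * g), np.sqrt(r * b), np.sqrt(g * b)])

rgb = np.array([0.2, 0.5, 0.3])   # an arbitrary camera response
k = 2.0                           # a change in exposure

# RPCC features scale linearly with exposure; PCC features do not.
print(np.allclose(rpcc_terms(k * rgb), k * rpcc_terms(rgb)))  # True
print(np.allclose(pcc_terms(k * rgb), k * pcc_terms(rgb)))    # False
```

Because the RPCC feature vector scales with k, the subsequent linear regression to XYZ commutes with exposure changes, preserving hue and saturation just as LCC does.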
A widely-viewed image of a dress elicits striking individual variation in colour perception. Experiments with multiple variants of the image suggest that the individual differences may arise through the action of visual mechanisms that normally stabilise object colour.
Adult colour preference has been summarized quantitatively in terms of weights on the two fundamental neural processes that underlie early colour encoding: the S-(L+M) ('blue-yellow') and L-M ('red-green') cone-opponent contrast channels (Ling, Hurlbert & Robinson, 2006; Hurlbert & Ling, 2007). Here, we investigate whether colour preference in 4-5-month-olds may be analysed in the same way. We recorded infants' eye-movements in response to pairwise presentations of eight colour stimuli varying only in hue. Infants looked longest at reddish and shortest at greenish hues. Analyses revealed that the L-M and S-(L+M) contrast between stimulus colour and background explained around half of the variation in infant preference across the hue spectrum. Unlike adult colour preference patterns, there was no evidence for sex differences in the weights on either of the cone-opponent contrast components. The findings provide a quantitative model of infant colour preference that summarizes variation in infant preference across hues.
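The weighted cone-opponent model described above is, at its core, a linear regression of preference onto two contrast predictors. The sketch below shows the fitting step on synthetic, noise-free data; the contrast values and weights are entirely hypothetical and stand in for the measured L-M and S-(L+M) contrasts and looking times.

```python
import numpy as np

# Hypothetical cone-opponent contrasts for 8 hue stimuli
# (columns: L-M contrast, S-(L+M) contrast); synthetic for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 2))

w_true = np.array([0.6, -0.3])   # assumed "preference weights"
looking_time = X @ w_true        # idealised, noise-free preference scores

# Recover the weights on the two opponent channels by least squares.
w_fit, *_ = np.linalg.lstsq(X, looking_time, rcond=None)
print(w_fit)
```

With real looking-time data the fit is noisy and the recovered weights explain only part of the variance, consistent with the roughly 50% reported in the abstract.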
The deficits in texture, motion, and depth perception incurred in monkeys at isoluminance were compared with the responses of neurons of the color-opponent and broad-band systems in the lateral geniculate nucleus. Texture perception, assumed to be carried by the color-opponent system, and motion and depth perception, ascribed to the broad-band pathway, were all found to be compromised but not abolished at isoluminance. Correspondingly, both the color-opponent and the broad-band systems were affected at isoluminance, but the activity of the neurons in neither system was abolished. These results suggest that impairment of visual capacities at isoluminance cannot be uniquely attributed to either of these systems and that isoluminant stimuli are inappropriate for the psychophysical isolation of these pathways.