A neuromorphic model for achromatic and chromatic surface representation of natural images (2004)
DOI: 10.1016/j.neunet.2004.02.007

Cited by 30 publications (33 citation statements)
References: 55 publications
“…At the present, only few such models are available (e.g. Hong & Grossberg, 2004). Our contribution takes a step towards this direction.…”
Section: Discussion
Citation type: mentioning (confidence: 99%)
“…A standard technique is to band-pass filter the visual input over multiple scales, and subsequently adding luminance information to recover absolute luminance levels (e.g. Hong & Grossberg, 2004; Neumann, 1996; Pessoa et al., 1995). We instead propose a multiplexed retinal code, which is generated from contrast responses by modulating ON-cell (OFF-cell) responses with local brightness (darkness) (Fig.…”
Section: Recovering Absolute Levels Of Perceived Luminance
Citation type: mentioning (confidence: 99%)
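The quoted statement contrasts a standard luminance-recovery technique (multi-scale band-pass filtering with a low-pass luminance term added back) with the citing authors' multiplexed retinal code. The sketch below illustrates only that standard technique in its simplest form; it is not the Hong & Grossberg (2004) model or the citing authors' proposal, and the function name, scales, and weights are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def bandpass_luminance_sketch(image, scales=(1, 2, 4, 8),
                              luminance_sigma=16, luminance_weight=0.5):
    """Toy multi-scale band-pass decomposition plus a coarse luminance term."""
    image = np.asarray(image, dtype=float)
    bandpass_sum = np.zeros_like(image)
    for sigma in scales:
        # Difference of Gaussians acts as a band-pass filter at this scale.
        center = gaussian_filter(image, sigma)
        surround = gaussian_filter(image, 2 * sigma)
        bandpass_sum += center - surround
    # A heavily blurred copy of the input restores information about absolute
    # luminance levels that zero-mean band-pass responses discard.
    luminance = gaussian_filter(image, luminance_sigma)
    return bandpass_sum + luminance_weight * luminance
```

For a 2-D grayscale array `img`, `bandpass_luminance_sketch(img)` returns a reconstruction in which both the band-pass structure and a coarse estimate of absolute luminance are retained.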
“…Lighting conditions and surface context dramatically influence color perception. Preprocessing requires discounting the illuminant, perceptual grouping, surface filling-in and anchoring processes that have been simulated by other neural models, including lightness data (e.g., Grossberg & Todorovic, 1988; Hong & Grossberg, 2004; Mingolla et al., 1999; Pessoa et al., 1995). The authors' studies of color category learning and naming used inputs that do not represent challenges that autonomous robots would meet in the real world.…”
Section: Realistic Constraints On Brain Color Perception and Category
Citation type: mentioning (confidence: 99%)
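The statement lists several preprocessing stages (discounting the illuminant, perceptual grouping, surface filling-in, anchoring) that the cited neural models simulate. As a rough illustration of only the first of these stages, the sketch below applies a gray-world correction, a far simpler stand-in than the neural discounting mechanisms in those models; the function name and the assumed 8-bit value range are not from any cited source.

```python
import numpy as np


def gray_world_discount(rgb_image, eps=1e-6):
    """Crude 'discount the illuminant' step using the gray-world assumption."""
    rgb = np.asarray(rgb_image, dtype=float)
    # Per-channel scene means serve as a rough estimate of the illuminant color.
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    # Gains that map the estimated illuminant to neutral gray.
    gains = channel_means.mean() / (channel_means + eps)
    # Assumes an 8-bit H x W x 3 image; clip to keep values in range.
    return np.clip(rgb * gains, 0.0, 255.0)
```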