Abstract
Information about color, material, texture, and shape guides how we interact with objects. We developed a paradigm that quantifies how two object properties (color and material) combine in object selection. On each experimental trial, observers viewed three blob-shaped objects (a target and two tests) and selected the test that was more similar to the target. Across trials the target was fixed while the tests varied in color and material. We present a novel observer model that allows us to describe observers' selection data in terms of (1) the underlying perceptual stimulus representation and (2) a color-material weight, which quantifies the relative importance of color versus material in selection. We document large individual differences in the color-material weight. Furthermore, our analyses reveal limits on how precisely selection data can simultaneously constrain the perceptual representation and the color-material weight. These limits should guide future efforts towards understanding the multidimensional nature of object perception.