Material perception, texture, and peripheral vision
Description
Humans viewing a scene can identify materials, such as a hardwood floor or a smooth leather jacket, and infer their properties. Prior research shows that visual texture (a rich set of statistics computed on the image of the material) can play an important role in how humans and machines identify materials. We further investigate this idea, testing how well a prominent set of texture statistics (Portilla & Simoncelli, 2000) can be used to categorize images of materials under large variations in viewpoint, scale, and structure. We find a large range in performance, depending on both the material category and the details of the specific image.
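The general approach described above, categorizing a material image by comparing summary statistics, can be sketched as follows. This is a toy illustration under loose assumptions: the function names are invented, and the handful of ad hoc statistics here (marginal moments plus neighbor correlations) stands in for the far richer wavelet-based set of the actual Portilla & Simoncelli (2000) model.

```python
import numpy as np

def texture_stats(img):
    """Summarize a grayscale image with a few texture statistics.

    A highly simplified stand-in for the Portilla & Simoncelli (2000)
    statistics, which include wavelet marginals and cross-scale
    correlations rather than these ad hoc measures.
    """
    mu = img.mean()
    stats = [mu, img.std(), ((img - mu) ** 3).mean()]
    # Neighbor correlations (vertical, then horizontal) capture
    # coarse spatial structure that the marginal moments miss.
    for shifted in (np.roll(img, 1, axis=0), np.roll(img, 1, axis=1)):
        stats.append(np.corrcoef(img.ravel(), shifted.ravel())[0, 1])
    return np.array(stats)

def nearest_category(img, prototypes):
    """Assign img to the category whose prototype statistics are closest."""
    s = texture_stats(img)
    dists = {name: np.linalg.norm(s - proto)
             for name, proto in prototypes.items()}
    return min(dists, key=dists.get)

# Hypothetical usage with two synthetic "materials": white noise vs.
# a smooth horizontal gradient.
rng = np.random.default_rng(0)
noise = rng.random((32, 32))
smooth = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
prototypes = {"noise": texture_stats(rng.random((32, 32))),
              "smooth": texture_stats(smooth)}
print(nearest_category(noise, prototypes))
```

A real experiment would of course use many labeled photographs per material category and the full statistic set; the point here is only the shape of the pipeline: images are reduced to statistic vectors and categorized by distance in that space.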
Interestingly, recent models of peripheral vision (Balas, Nakano, & Rosenholtz, 2009) suggest a texture-like encoding. This might mean that visual textures are well represented by peripheral vision, and it is therefore natural to ask how well humans perceive materials in the periphery. Using a gaze-contingent display, we tested peripheral material categorization, and again found a large range in performance. Notably, there is a modest but significant correlation between peripheral performance and the texture model's predictions. I will discuss where the model predictions and human performance agree, where they differ, and speculate as to why. As this is largely work in progress, I hope to get feedback on what might be interesting in future experiments and analyses.