Psychophysical tests of human visual encoding models

If you have a question about this talk, please contact Rafal Mantiuk.

The human visual system compresses the information about the world implicit in the light entering our eyes. Decades of research in vision science have provided good hypotheses for the features that are encoded by the early visual system and made available for cognition and action. One approach to testing these hypotheses uses analysis by synthesis: one can generate artificial image stimuli that should differentiate competing encoding accounts, or for which an encoding account makes a strong prediction about discriminability. A classical example from vision is colour metamerism: two spectrally distinct surfaces will appear to be the same colour as long as the ratios of cone activations are identical (and the context is comparable). I will present work extending this concept to the discriminability of photographic scenes. I will show examples from past work in which we used this logic to psychophysically test a popular analogy for vision in the periphery, namely that it is a “texture-like” representation. We find that two extant models fail to adequately capture image discriminability, and we speculate about what ingredients might be missing. Ongoing work extends this approach in a data-driven direction and expands it to test other models. Overall, classical psychophysical methods, combined with hypotheses from vision science and modern tools in image synthesis, provide a powerful approach to testing the functional encoding of visual information.
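As a concrete illustration of the metamerism logic described above (a sketch for this listing, not material from the talk), the Python snippet below constructs two different spectral power distributions that produce the same L, M, S cone activations. The cone sensitivity curves are made-up Gaussians rather than measured cone fundamentals, and the second spectrum is built by adding a "metameric black", a perturbation lying in the null space of the cone-sensitivity matrix.

    import numpy as np

    wavelengths = np.linspace(400, 700, 301)  # nm, 1 nm steps

    def gaussian(peak, width):
        return np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)

    # Illustrative L, M, S cone sensitivities (3 x N matrix); real work would
    # use measured cone fundamentals.
    cones = np.stack([gaussian(565, 50), gaussian(540, 45), gaussian(445, 30)])

    # An arbitrary smooth "surface" spectrum.
    spectrum_a = 0.5 + 0.4 * np.sin(wavelengths / 40.0)

    # Project a perturbation out of the cone matrix's row space so that what
    # remains is invisible to the cones (a "metameric black").
    perturbation = gaussian(600, 20) - gaussian(480, 20)
    coeffs, *_ = np.linalg.lstsq(cones.T, perturbation, rcond=None)
    metameric_black = perturbation - cones.T @ coeffs

    spectrum_b = spectrum_a + metameric_black

    print("LMS of spectrum A:", cones @ spectrum_a)
    print("LMS of spectrum B:", cones @ spectrum_b)  # matches A up to numerical error
    print("Spectra identical?", np.allclose(spectrum_a, spectrum_b))  # False

Running this prints matching cone-activation triplets for two physically different spectra, which is the property the talk extends from colour matching to the discriminability of photographic scenes.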

Zoom link: https://cam-ac-uk.zoom.us/j/84318599913?pwd=WmxmYXpMSCtzeG0rakdaZzZ6Z2R5dz09

This talk is part of the Rainbow Group Seminars series.
