Visual psychophysics and modeling of contextual effects in orientation and motion perception


When we look, we perceive an image of the world through our eyes. Perception of its constituents is known to involve hierarchical as well as parallel streams of information processing, with higher levels performing more complex computations (e.g. biological motion, the perception of a walking human, appears to be processed in a mid-to-high-level brain area that pools information from different streams). Basic features of a sequence of images, such as orientation, spatial frequency, color, and motion, are already extracted early in the visual pathway (area V1), and from these constituents the successive neuronal areas reconstruct an internal representation of the visual world, that is, the perceived world.

The goal of the project is to understand how humans perceive the two circular features 'orientation' (e.g. of a line segment) and 'direction of motion' (e.g. of a textured pattern), based on the prior knowledge that these features are coded and processed in the brain by computationally very similar neuronal structures (area V1 for orientation and area V5/MT for motion direction). We are therefore exploring the similarities and differences in the contextual effects on the perception of orientation and motion direction, with the ultimate aim of explaining the perception of both features by a unified theoretical account of neurons sensitive to circular features.

We base our computational modeling on current knowledge of the neuronal coding of the two features in areas V1 and V5/MT of humans and macaque monkeys. In parallel, we perform psychophysical experiments to test the predictive power of the models against the final perceptual outcome.
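
As a minimal illustration of the kind of population-coding model this refers to, the following Python sketch encodes a motion direction in a bank of von Mises-tuned units, applies a toy divisive surround suppression centered on a contextual direction, and decodes the result with a population vector. The tuning-curve shape, the suppression term, and all parameter values are illustrative assumptions, not the specific model developed in this project.

```python
import numpy as np

def von_mises_tuning(theta, pref, kappa=2.0, gain=1.0):
    """Firing rate of a unit tuned to a circular feature (direction of motion)."""
    return gain * np.exp(kappa * (np.cos(theta - pref) - 1.0))

def population_vector_decode(rates, prefs):
    """Decode the represented angle as the rate-weighted circular mean of preferences."""
    return np.angle(np.sum(rates * np.exp(1j * prefs)))

# Population of direction-selective units with evenly spaced preferred directions.
prefs = np.linspace(-np.pi, np.pi, 64, endpoint=False)

stimulus = np.deg2rad(10.0)   # central stimulus direction
context = np.deg2rad(-20.0)   # surrounding (contextual) direction

# Response to the central stimulus alone.
rates = von_mises_tuning(stimulus, prefs)

# Toy contextual effect: divisive suppression strongest for units tuned near the
# context direction (an illustrative assumption, not the project's actual model).
suppression = 1.0 + 0.8 * von_mises_tuning(context, prefs, kappa=1.5)
rates_context = rates / suppression

print("decoded alone  : %.1f deg" % np.rad2deg(population_vector_decode(rates, prefs)))
print("decoded context: %.1f deg" % np.rad2deg(population_vector_decode(rates_context, prefs)))
# The decoded direction is repelled away from the contextual direction,
# qualitatively resembling repulsive contextual effects reported in psychophysics.
```

Because orientation is defined on a 180-degree cycle while motion direction spans 360 degrees, an orientation version of the same sketch would simply double the angles before encoding and halve the decoded estimate; this shared circular structure is what motivates a unified account of the two features.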