New research sheds light on how human vision perceives scale – ScienceDaily


Researchers from Aston University and the University of York have discovered new insights into how the human brain makes perceptual judgments of the outside world.

The study, published on May 8 in the journal PLOS ONE, explored the computational mechanisms the human brain uses to perceive the size of objects in the world around us.

The research, led by Professor Tim Meese of the School of Optometry at Aston University and Dr Daniel Baker of the Department of Psychology at the University of York, shows how our visual system exploits defocus blur to infer perceptual scale, but does so only in a crude way.

It is known that to derive the size of an object from the size of its retinal image, our visual system needs to estimate the distance to the object. The retinal image contains many pictorial cues, such as linear perspective, that help the system derive the relative size of objects. However, to derive absolute size, the system needs to know the spatial scale.

The visual system can achieve this by taking account of defocus blur, the blurring of the parts of an image that lie outside the depth of focus. The mathematics behind this has been worked out in detail by others, but the study asked: does human vision actually exploit this mathematics?
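The optics involved can be illustrated with the textbook thin-lens approximation, in which the diameter of the defocus blur circle grows as an object moves away from the focal plane. The sketch below is a minimal illustration of that standard relation, not the specific model tested in the paper; the function name and the example parameter values are illustrative assumptions.

```python
def blur_circle_mm(aperture_mm, focal_mm, focus_m, object_m):
    """Diameter (mm) of the defocus blur circle on the sensor,
    under the standard thin-lens approximation:
        c = A * f * |S2 - S1| / (S2 * (S1 - f))
    where A is the aperture diameter, f the focal length,
    S1 the focus distance and S2 the object distance."""
    S1 = focus_m * 1000.0   # focus distance in mm
    S2 = object_m * 1000.0  # object distance in mm
    return aperture_mm * focal_mm * abs(S2 - S1) / (S2 * (S1 - focal_mm))

# An object in the focal plane is rendered perfectly sharp...
print(blur_circle_mm(25, 50, 2.0, 2.0))   # 0.0
# ...and blur grows as the object recedes from the focal plane,
# which is the distance information the visual system could exploit.
print(blur_circle_mm(25, 50, 2.0, 10.0))  # ~0.51
```

Because blur depends on distance, a consistent pattern of blur across a scene carries information about absolute spatial scale.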

The research team presented participants with pairs of photographs: full-scale railway scenes subjected to various artificial blur treatments, and small-scale models of railway scenes shot with a long exposure and a small aperture to minimise defocus blur. The task was to identify which image in each pair showed the real, full-size scene.

When the artificial blur was oriented appropriately with respect to the ground plane (the horizontal plane representing the ground on which the viewer is standing) in the full-scale scenes, participants were fooled into thinking that the small models were the full-sized scenes. Remarkably, this did not require realistic gradients of blur: simple uniform blur bands at the top and bottom of the images achieved roughly equivalent miniaturisation effects.
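The uniform-blur-band manipulation described above can be mimicked in a few lines: blur the top and bottom strips of an image while leaving a sharp horizontal band in the middle, which is essentially how tilt-shift "fake miniature" photo filters work. Below is a rough NumPy sketch; the band fraction and kernel size are arbitrary choices for illustration, not the study's parameters.

```python
import numpy as np

def box_blur(img, k):
    """Separable k-by-k box blur with edge padding (grayscale array)."""
    pad = k // 2
    kernel = np.ones(k) / k
    out = np.pad(img.astype(float), pad, mode="edge")
    # Horizontal pass, then vertical pass of the moving average.
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, out)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)
    return out[pad:-pad, pad:-pad]

def fake_miniature(img, band_frac=1/3, k=9):
    """Blur uniform bands at the top and bottom of the image,
    leaving the central strip sharp: the crude manipulation that
    produced a miniaturisation effect in the study."""
    h = img.shape[0]
    band = int(h * band_frac)
    out = img.astype(float).copy()
    out[:band] = box_blur(img[:band], k)
    out[-band:] = box_blur(img[-band:], k)
    return out
```

Applied to a photograph of a full-scale scene, this kind of crude banded blur is enough to make the scene look like a small model.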

Tim Meese, Professor of Vision Sciences at Aston University, said: “Our results suggest that human vision can exploit defocus blur to infer perceptual scale, but it does so crudely: more a heuristic than a metrical analysis. Overall, our findings provide new insights into the computational mechanisms used by the human brain in making perceptual judgments about the relationship between ourselves and the outside world.”

Daniel Baker, Senior Lecturer in Psychology at the University of York, said: “These findings show that our perception of size is not perfect and can be influenced by other features of a scene. They also highlight the remarkable adaptability of the visual system. This may be relevant to understanding the computational principles underlying our perception of the world, for example when judging the size and distance of hazards while driving.”

