How do we perceive a stable world in the face of noisy input?
How do we predict what will happen next as a scene unfolds?
How do we isolate the key contents of a scene from surrounding distractions?
Sensory input to the brain is noisy and discontinuous, yet our experience of the world is stable and fluid over time. Imagine watching a road sign through pouring rain, intermittently occluded by passing cars and obscured by droplets collecting on the windshield. How is it that the sign itself looks the same from one moment to the next, when the image of it that reaches your eyes is constantly changing? We study how human sensory systems use input from the recent past to reconstruct the present scene, promoting fluid perception over time.
In Fischer & Whitney, Nature Neuroscience, 2014, we found that perception in the present moment is systematically biased toward visual input from the recent past. This "serial dependence" in perception may stabilize our visual experience in the face of noise.
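To make the idea concrete, here is a minimal toy sketch (not the model from the paper): if each percept blends the current input with the previous percept, perception is pulled toward the recent past, and a noisy input sequence becomes more stable over time. The function name and the `weight` parameter are illustrative assumptions.

```python
def serially_dependent_percept(stimuli, weight=0.3):
    """Toy serial-dependence model: each percept is a weighted blend of
    the current stimulus and the previous percept, so perception is
    biased toward recent input. `weight` sets the strength of the pull
    toward the past (an assumed, illustrative parameter)."""
    percepts = []
    prev = None
    for s in stimuli:
        # First percept has no history; later percepts mix in the past.
        p = s if prev is None else (1 - weight) * s + weight * prev
        percepts.append(p)
        prev = p
    return percepts

# Noisy measurements of a sign that is really unchanging (true value 10.0):
noisy = [10.4, 9.5, 10.6, 9.3, 10.2]
smoothed = serially_dependent_percept(noisy)
```

In this caricature, the smoothed sequence fluctuates less than the raw input, which is the sense in which a bias toward the recent past can stabilize experience.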
Understanding and interacting with the visual world requires more than identifying "what is where". To engage with a scene, we must understand its physical structure – how the constituent objects rest on and support each other, how much force would be required to move them, and how they will behave when they fall, roll, or collide. Work in our lab seeks to characterize people's physical inference abilities and identify the neural resources we use to interpret and predict the physical events in a scene.
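One way to make "physical inference" concrete is with a toy stability judgment. This sketch assumes a made-up 2-D scenario, a block resting on a support, and judges whether it will stay put by checking whether its center of mass lies over the support; it is an illustration of the kind of prediction people make, not a model from our work.

```python
def will_stay(block_center_x, support_left, support_right):
    """Toy physical judgment for a 2-D block resting on a support:
    the block stays put if its center of mass lies horizontally
    over the support surface. All geometry here is illustrative."""
    return support_left <= block_center_x <= support_right

# A block centered at x=1.0 over a support spanning [0, 2] is stable;
# slide it to x=2.5 and the same judgment predicts it will topple.
print(will_stay(1.0, 0.0, 2.0))  # True
print(will_stay(2.5, 0.0, 2.0))  # False
```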
At a given moment, only a fraction of the vast information in a scene is crucial to the task at hand; the remainder can be distracting. A fundamental goal in scene understanding is therefore to isolate the key objects from the clutter, suppressing the influence of irrelevant input on perception. Our lab studies how the brain’s attentional system filters out distractors, and how this filtering enhances the processing of important, attended objects.
In Fischer & Whitney, Nature Communications, 2012, we found that visual maps in the pulvinar nuclei of the thalamus gate out distracting input, selectively representing attended visual information. This unique mapping may help shape visual responses in the cortex, biasing competition toward important objects and away from distractions.
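The gating idea can be caricatured in a few lines: responses to the attended object pass through at full strength, while responses to distractors are multiplicatively suppressed, biasing the representation toward what matters. This is a toy sketch under assumed parameters, not the mechanism reported in the paper.

```python
def gate_responses(responses, attended, gain=1.0, suppression=0.2):
    """Toy attentional filter: the response at the attended location
    passes with full gain; all other responses are multiplicatively
    suppressed. `gain` and `suppression` are illustrative values."""
    return {loc: r * (gain if loc == attended else suppression)
            for loc, r in responses.items()}

# Visual responses to objects in the rainy-road scene above:
scene = {"sign": 0.9, "car": 0.8, "raindrop": 0.6}
gated = gate_responses(scene, attended="sign")
# After gating, the attended "sign" dominates the representation.
```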
In Fischer, Mikhael, Tenenbaum, & Kanwisher, PNAS, 2016, we uncovered a set of brain regions that are recruited when people observe physical events and predict their outcomes. This "physics engine" in the brain falls within brain areas previously implicated in action planning and tool use, suggesting an intimate connection between physical scene understanding and motor planning in the brain.