This will be the last talk session of #VSS2025! End the conference with some exciting science
Tomorrow at 6 pm, I will be presenting my work on pre-saccadic remapping in the human visual cortex, revealed by a voxel-wise encoding model on fMRI data (Talk 55.13 in Talk Room 1). Please visit and share your thoughts! @juliedgolomb.bsky.social
@cognitionjournal.bsky.social
Read the full paper in Cognition: doi.org/10.1016/j.co...
Open data/code: osf.io/f62dr
With: Tzu-Yao Chiu, Jake Ferreira, & Julie Golomb
The takeaway: The brain actively employs an assumption about the external world to aid stable perception across eye movements.
The result? By default, people assumed visual scenes stayed stable across saccades, but when there was strong evidence of change, they scrutinized the trans-saccadic change more carefully.
In our latest paper, we combined the blanking paradigm with AI-generated scene wheel stimuli (Son et al., 2022) to test the underlying stability mechanism.
New publication alert!
Ever wonder how we perceive a stable world despite constantly moving our eyes?
We investigated how the brain maintains visual stability across eye movements in natural scenes.