We show that synesthesia is sensory and automatic in nature: the pupil scales with the brightness of experienced synesthetic colors. doi.org/10.7554/eLif...
Now in its final form @elife.bsky.social (assessed convincing & valuable in round 1).
If anyone wants to pick up the method, happy to share & explain!
07.03.2026 07:58
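If you want to try the logic before asking for the full method, here is a minimal sketch, assuming a hypothetical CSV with per-trial pupil size and the luminance of each synesthete's reported color (file and column names invented for illustration): brighter evoked colors should go with smaller pupils.

```python
# Minimal sketch, NOT the paper's pipeline. Assumed per-trial columns:
# subject, grapheme, reported_luminance (of the synesthetic color), pupil_mm.
import pandas as pd
from scipy import stats

trials = pd.read_csv("synesthesia_pupil.csv")  # hypothetical file/columns

# Average pupil size per grapheme within each subject
per_grapheme = trials.groupby(["subject", "grapheme"], as_index=False).agg(
    luminance=("reported_luminance", "mean"),
    pupil=("pupil_mm", "mean"),
)

# Within-subject correlation: if the pupil scales with evoked brightness,
# brighter synesthetic colors should yield smaller pupils (negative r).
for subj, d in per_grapheme.groupby("subject"):
    r, _ = stats.pearsonr(d["luminance"], d["pupil"])
    print(subj, round(r, 2))
```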
@attentionlab.bsky.social identified (what look like) #AI bots in an online response-time task (Posner cuing). Give-aways are normally distributed RTs and a lack of serial-dependence effects. Pretty troubling. www.pnas.org/doi/10.1073/...
25.02.2026 11:48
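For anyone wanting to screen their own online RT data for the two give-aways above, a rough sketch; the cut-offs are illustrative assumptions, not the criteria used in the letter.

```python
# Rough screen (thresholds are illustrative): human RTs are right-skewed and
# serially dependent; the flagged responders showed near-Gaussian RTs and no
# trial-to-trial dependence.
import numpy as np
from scipy import stats

def flag_suspect(rts: np.ndarray) -> dict:
    skew = stats.skew(rts)                        # humans: clearly positive skew
    lag1 = np.corrcoef(rts[:-1], rts[1:])[0, 1]   # humans: lag-1 autocorrelation > 0
    return {"skew": skew, "lag1": lag1,
            "suspect": skew < 0.3 and abs(lag1) < 0.05}

# A Gaussian responder with independent trials trips both checks
rng = np.random.default_rng(0)
print(flag_suspect(rng.normal(450, 60, size=400)))
```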
Recent work has shown how vulnerable online survey research is to LLMs. Motivated by this, we examined our online Posner cueing data from Prolific. It's concerning. We now must carefully consider when (or whether?) online behavioral data can be trusted.
see our comment:
www.pnas.org/doi/10.1073/...
19.02.2026 12:00
Synesthetes claim sensory experiences, such as seeing color when reading or hearing a (black) number. ✨ But how genuine are these reports and sensations? We introduce a rather direct measure of synesthetic perception: synesthetes' pupils respond to evoked color as if it were real color #vision! 👁️🎨🧪
26.11.2025 16:40
Our commentary @stigchel.bsky.social on Ruth Rosenholtz's "Visual Attention in Crisis" paper is now available:
doi.org/10.1017/S014...
We argue that effort must be considered when aiming to quantify capacity limits or a task's complexity.
26.11.2025 13:40
Planning on running a RIFT (Rapid Invisible Frequency Tagging) study? In a new manuscript, we put together the RIFT know-how accumulated over the years by multiple labs (@lindadrijvers.bsky.social, @schota.bsky.social, @eelkespaak.bsky.social, with Cecília Hustá and others).
Preprint: osf.io/preprints/ps...
29.10.2025 10:52
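For readers new to the technique, a minimal sketch of the core RIFT measurement with a simulated sensor (not the manuscript's pipeline): the stimulus flickers at a high, invisible rate, and spectral power at exactly that frequency indexes its cortical response.

```python
# Toy RIFT readout on simulated data; all parameters are illustrative.
import numpy as np

fs, tag_freq, dur = 1000, 60.0, 2.0          # sampling rate (Hz), tagging freq, seconds
t = np.arange(int(fs * dur)) / fs
rng = np.random.default_rng(0)

# Simulated sensor: a weak 60 Hz response buried in broadband noise
signal = 0.1 * np.sin(2 * np.pi * tag_freq * t) + rng.standard_normal(t.size)

# Power at the tagging frequency vs. the surrounding noise floor
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
tag_power = spectrum[np.argmin(np.abs(freqs - tag_freq))]
noise_floor = np.median(spectrum[(freqs > 50) & (freqs < 70)])
print(f"SNR at {tag_freq:.0f} Hz: {tag_power / noise_floor:.1f}")
```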
Filled with a bunch of extra analyses, this is now accepted in The Journal of Neuroscience @sfn.org! You can have a sneak peek here: www.biorxiv.org/content/10.1...
24.10.2025 08:35
Spatial attention and working memory are widely thought to be tightly coupled. Yet, distinct neural activity tracks attentional breadth and WM load.
In a new paper @jocn.bsky.social, we show that pupil size independently tracks breadth and load.
doi.org/10.1162/JOCN...
14.10.2025 14:04
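The independence logic in one hedged sketch (hypothetical file and column names, not the paper's analysis): if pupil size tracks breadth and load independently, both predictors should carry unique variance in a joint model.

```python
# Sketch only; pupil_trials.csv and its columns (pupil_mm, breadth_deg,
# wm_load) are assumptions, not the paper's data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pupil_trials.csv")  # hypothetical per-trial data

# Joint model: unique (partial) effects of attentional breadth and WM load
model = smf.ols("pupil_mm ~ breadth_deg + wm_load", data=df).fit()
print(model.summary())
```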
Very happy to see this preprint out! The amazing @danwang7.bsky.social was on fire sharing this work at #ECVP2025, gathering loads of attention, and here you can find the whole thing!
Using RIFT we reveal how the competition between top-down goals and bottom-up saliency unfolds within visual cortex.
28.08.2025 10:09
I'll show some (I think) cool stuff about how we can measure the phenomenology of synesthesia in a physiological way at #ECVP - Color II, atrium maximum, 9:15, Thursday.
Say hi and show your colleagues that you're one of the dedicated ones by getting up early on the last day!
27.08.2025 21:13
And now without Bluesky making the background black...
24.08.2025 20:57
#ECVP2025 starts with a fully packed room!
I'll show data demonstrating that synesthetic perception is perceptual, automatic, and effortless.
Join my talk (Thursday, early morning..., Color II) to learn how the qualia of synesthesia can be inferred from pupil size.
Join and/or say hi!
24.08.2025 16:28
Excited to give a talk at #ECVP2025 (Tuesday morning, Attention II) on how spatially biased attention during VWM does not boost excitability the same way it does when attending to the external world, using Rapid Invisible Frequency Tagging (RIFT). @attentionlab.bsky.social @ecvp.bsky.social
24.08.2025 13:13
Excited to share that I'll be presenting my poster at #ECVP2025 on August 26th (afternoon session)!
🧠✨ Our work focuses on the dynamic competition between bottom-up saliency and top-down goals in early visual cortex, using Rapid Invisible Frequency Tagging.
@attentionlab.bsky.social @ecvp.bsky.social
24.08.2025 13:28
[Figure: data saturation for gaze heatmaps. Each additional participant initially brings the subsample's NSS or AUC (heatmap-similarity measures) much closer to the full sample, but returns diminish at higher n.]
Gaze heatmaps are popular, especially among eye-tracking beginners and in many applied domains. How many participants should be tested?
It depends, of course, but our guidelines help you navigate this in an informed way.
Out now in BRM (free) doi.org/10.3758/s134...
@psychonomicsociety.bsky.social
29.07.2025 07:37
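A toy version of the saturation analysis (illustrative, not the paper's exact pipeline): build heatmaps from growing subsamples and score each against the full sample's fixations with NSS; the curve flattens as n grows.

```python
# Illustrative only: simulated fixations with a central hotspot.
import numpy as np
from scipy.ndimage import gaussian_filter

def heatmap(fix_xy, shape=(108, 192), sigma=5):
    h = np.zeros(shape)
    for x, y in fix_xy:
        h[int(y), int(x)] += 1
    return gaussian_filter(h, sigma)

def nss(saliency, fix_xy):  # Normalized Scanpath Saliency at fixated pixels
    z = (saliency - saliency.mean()) / saliency.std()
    return np.mean([z[int(y), int(x)] for x, y in fix_xy])

rng = np.random.default_rng(1)
per_subject = [np.column_stack([np.clip(rng.normal(96, 25, 50), 0, 191),
                                np.clip(rng.normal(54, 15, 50), 0, 107)])
               for _ in range(100)]
all_fix = np.vstack(per_subject)
for n in (5, 10, 20, 50, 100):
    sub_map = heatmap(np.vstack(per_subject[:n]))
    print(n, round(nss(sub_map, all_fix), 2))  # gains shrink at higher n
```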
Thrilled to share that I successfully defended my PhD dissertation on Monday June 16th!
The dissertation is available here: doi.org/10.33540/2960
18.06.2025 14:21
Now published in Attention, Perception & Psychophysics @psychonomicsociety.bsky.social
Open Access link: doi.org/10.3758/s134...
12.06.2025 07:21
And Monday morning:
@suryagayet.bsky.social
has a poster (pavilion) on:
Feature Integration Theory revisited: attention is not needed to bind stimulus features, but prevents them from falling apart.
Happy @vssmtg.bsky.social #VSS2025 everyone, enjoy the meeting and the very nice coffee mugs!
18.05.2025 09:56
@vssmtg.bsky.social
presentations today!
R2, 15:00
@chrispaffen.bsky.social:
Functional processing asymmetries between nasal and temporal hemifields during interocular conflict
R1, 17:15
@dkoevoet.bsky.social:
Sharper Spatially-Tuned Neural Activity in Preparatory Overt than in Covert Attention
18.05.2025 09:41
And tomorrow, Monday:
Surya Gayet in the Pavilion in the morning session:
Feature Integration Theory revisited: attention is not needed to bind stimulus features, but prevents them from falling apart.
Enjoy VSS everyone!
18.05.2025 09:40
We previously showed that affordable eye movements are preferred over costly ones. What happens when salience comes into play?
In our new paper, we show that even when salience attracts gaze, costs remain a driver of saccade selection.
OA paper here:
doi.org/10.3758/s134...
16.05.2025 13:36
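One way to picture the claim, as a toy model of our own making (not the paper's): treat saccade target choice as a softmax over salience minus effort cost, so a salient but costly target can still lose to a cheaper one.

```python
# Toy choice model; weights and values are made up for illustration.
import numpy as np

def choice_prob(salience, cost, w_sal=1.0, w_cost=1.0, beta=3.0):
    value = w_sal * np.asarray(salience) - w_cost * np.asarray(cost)
    ev = np.exp(beta * (value - value.max()))  # stable softmax
    return ev / ev.sum()

# Target A: highly salient but in a costly direction; B: plain but cheap
print(choice_prob(salience=[0.9, 0.4], cost=[0.8, 0.1]))
```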
Preparing overt eye movements and directing covert attention are neurally coupled. Yet, this coupling breaks down at the single-cell level. What about populations of neurons?
We show: EEG decoding dissociates preparatory overt from covert attention at the population level:
doi.org/10.1101/2025...
13.05.2025 07:51
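A minimal sketch of the decoding logic, with simulated data standing in for the EEG (the real pipeline is in the preprint): classify overt vs. covert preparation from single-trial channel patterns under cross-validation.

```python
# Simulated stand-in data; with real EEG, above-chance accuracy would
# dissociate overt from covert preparation at the population level.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X = np.random.randn(400, 64)   # trials x channels (e.g., mean amplitude in a window)
y = np.repeat([0, 1], 200)     # 0 = covert, 1 = overt preparation

clf = make_pipeline(StandardScaler(), LogisticRegression())
acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"decoding accuracy: {acc.mean():.2f}")  # ~0.50 here, by construction
```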
In our latest paper @elife.bsky.social we show that we choose to move our eyes based on effort minimization. Put simply, we prefer affordable over more costly eye movements.
eLife's digest:
elifesciences.org/digests/9776...
The paper:
elifesciences.org/articles/97760
#VisionScience
08.04.2025 08:06
[Image: heat map of gaze locations overlaid on a feature-rich collage image; a seascape with a kitesurfer, mermaid, turtle, and more.]
New preprint!
We present two very large eye-tracking datasets of museum visitors (4-81 y.o.!) who free-viewed (n=1248) or searched for a +/x (n=2827) in a single feature-rich image.
We invite you to (re)use the dataset and provide suggestions for future versions!
osf.io/preprints/os...
28.03.2025 09:34
Out in Psychophysiology (OA):
Typically, pupillometry struggles with complex stimuli. We introduced a method to study covert attention allocation in complex video stimuli:
effects of top-down attention, bottom-up attention, and pseudoneglect could all be recovered.
doi.org/10.1111/psyp.70036
21.03.2025 14:53
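The underlying idea in a hedged sketch (variable names are hypothetical; see the paper for the actual model): the pupil light response follows luminance at the attended location, so regressing the pupil trace on each region's luminance time course indicates where covert attention went.

```python
# Sketch only: compare how well each region's luminance predicts the pupil.
import numpy as np

def plr_attention_index(pupil, lum_left, lum_right, lag=30):
    # The light response lags the stimulus; shift luminance by ~lag samples.
    p = pupil[lag:]
    r_left = -np.corrcoef(p, lum_left[:-lag])[0, 1]   # brighter -> smaller pupil
    r_right = -np.corrcoef(p, lum_right[:-lag])[0, 1]
    return r_left - r_right  # > 0: pupil tracks left luminance (attention left)

rng = np.random.default_rng(2)
pupil, lum_l, lum_r = (rng.standard_normal(600) for _ in range(3))
print(plr_attention_index(pupil, lum_l, lum_r))
```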
Congrats to Luzi @luzixu.bsky.social! We're very proud of you!
19.03.2025 13:01
Presaccadic attention facilitates visual continuity across eye movements. However, recent work suggests that presaccadic attention may not shift upward. What's going on?
Using the pupil light response, our paper shows that presaccadic attention moves both upward and downward.
doi.org/10.1111/psyp.70047
19.03.2025 08:28