
Olaf Dimigen

@olaf.dimigen.de

Trying to understand how the brain makes sense of the world with (and despite) eye movements. Active visual cognition, combined eye-tracking/EEG, EEG methods. Toolboxes: EYE-EEG, opticat, UNFOLD. Previously @Berlin. Tenured assistant professor @Groningen

629 Followers · 713 Following · 12 Posts · Joined 08.10.2023

Latest posts by Olaf Dimigen @olaf.dimigen.de

Trial scheme (horizontal vs. vertical reading) and fixation-related potential results for readers from Taiwan vs. Mainland China

Everyday visual experience tunes neural processing. Using fixation-related EEG, this new work shows how the N1 preview benefit depends on Chinese readers' prior experience with left-to-right vs. up-down reading. @umaurer.bsky.social

www.authorea.com/doi/full/10....

24.11.2025 12:25 👍 3 🔁 1 💬 0 📌 0

This was not the case. Instead, results suggest that both magno- and parvo-biased information contributes to early, left-lateralized neural processes underlying visual word recognition. (2/2)

Available in Neurobiology of Language: direct.mit.edu/nol/article-...

@umaurer.bsky.social

17.11.2025 08:52 👍 1 🔁 0 💬 0 📌 0
The figure shows example stimuli from Experiments 1 and 2, which were biased toward M- vs. P-pathway processing by spatial filtering (Exp. 1) or by isoluminance (Exp. 2).

Happy to be a collaborator on this new work by first-author Xin Huang from Urs Maurer's lab. In two combined eye-tracking/EEG experiments, we asked whether rapidly processed magnocellular (M-pathway) information in parafoveal vision plays a special role for word recognition in natural reading. (1/2)

17.11.2025 08:39 👍 6 🔁 2 💬 2 📌 0
Tracking attention using RIFT with a consumer-monitor setup
Rapid Invisible Frequency Tagging (RIFT) is a recent technique that extends the traditional frequency tagging approach by stimulating at frequencies beyond the threshold of perception (≥60 Hz). By doin...

Together with recent work by @olaf.dimigen.de on using RIFT with a monitor setup, as well as our own new preprint on using a monitor setup to track attention (www.biorxiv.org/content/10.1...), RIFT is now a lot more accessible both in terms of available recommendations and materials.

29.10.2025 10:52 👍 4 🔁 1 💬 0 📌 0

Launched in 2023, Imaging Neuroscience is now firmly established, with full indexing (PubMed, etc.) and 700 papers to date.

We're very happy to announce that we are able to reduce the APC to $1400.

Huge thanks to all authors, reviewers, editorial team+board, and MIT Press.

05.09.2025 02:59 👍 233 🔁 80 💬 2 📌 6

Academic authors, here's a peek into the black box of journal publishing from a journal editor, if you can bear it:

06.09.2025 23:09 👍 1003 🔁 473 💬 18 📌 105

🚨 WOOHOO!! I am happy to share that I received the #ERCStG for my project PRECHRON: The Prefrontal Chronometer for Organizing Working Memory.

I am going to study #neuraloscillations during #workingmemory at the @rug.nl @rug-gmw.bsky.social

#brainstimulation #TMS #EEG

04.09.2025 11:54 👍 31 🔁 9 💬 5 📌 1

I do :)

26.08.2025 18:00 👍 1 🔁 0 💬 0 📌 0

Yes, sure, also significant, but wouldn't you agree that lateralization still looks pretty weak (cf. Fig. 8)? But the midline periphery (similar to our -12°) is also among the weakest signal locations (at least if stimuli are not scaled with cortical magnification), so it's probably also an SNR issue in our data.

22.08.2025 15:49 👍 0 🔁 0 💬 1 📌 0

1) May seem surprising, but very much in line with MEG. See Figs. 6 & 8 in Minarik et al. (2023, NeuroImage), who mapped 15 VF locations.
2) The pilot described here was run with DC-EOG, but without ET; that's part of an ongoing follow-up

22.08.2025 14:12 👍 1 🔁 0 💬 1 📌 0
Part of the main results figure

22.08.2025 11:58 👍 4 🔁 1 💬 0 📌 0

Our small pilot study shows RIFT works with an affordable 480 Hz OLED monitor & EEG:
✅ Reliable timing
✅ Robust tagging at 60 & 64 Hz with barely visible flicker
✅ Even weak peripheral responses

We hope this opens the door to RIFT studies by more labs.

(with Ioana Badea, Iarina Simon & Mark M. Span)

22.08.2025 11:52 👍 6 🔁 0 💬 0 📌 0
Rapid Invisible Frequency Tagging (RIFT) with a consumer monitor: A proof-of-concept
Rapid Invisible Frequency Tagging (RIFT) enables neural frequency tagging at rates above the flicker fusion threshold, eliciting steady-state responses to flicker that is almost imperceptible. While R...

🚨 New preprint: Invisible neural frequency tagging (RIFT) for the underfunded researcher:
👉 www.biorxiv.org/cgi/content/...

RIFT uses high-frequency flicker to probe attention in M/EEG with minimal stimulus visibility and little distraction. Until now, it required a costly high-speed projector.

22.08.2025 11:52 👍 33 🔁 17 💬 4 📌 2
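The idea behind the RIFT approach described in this thread, driving a stimulus sinusoidally at a rate above the flicker fusion threshold and later recovering a steady-state response at that exact frequency, can be sketched numerically. The snippet below is purely illustrative (it is not code from the preprint); the 480 Hz refresh rate and 64 Hz tag frequency are taken from the posts above, and the 10% modulation depth is an assumed placeholder.

```python
import numpy as np

REFRESH_HZ = 480   # monitor refresh rate (the 480 Hz OLED from the post)
TAG_HZ = 64        # tagging frequency, above the flicker fusion threshold
DURATION_S = 2.0   # arbitrary demo duration

# One luminance value per monitor frame: a low-amplitude sinusoidal
# modulation around mid-grey (0.5), i.e. barely visible flicker.
n_frames = int(REFRESH_HZ * DURATION_S)
t = np.arange(n_frames) / REFRESH_HZ
luminance = 0.5 + 0.1 * np.sin(2 * np.pi * TAG_HZ * t)

# Sanity check: the frame sequence carries its power exactly at TAG_HZ,
# which is the frequency one would look for in the EEG/MEG spectrum.
spectrum = np.abs(np.fft.rfft(luminance - luminance.mean()))
freqs = np.fft.rfftfreq(n_frames, d=1.0 / REFRESH_HZ)
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # 64.0
```

In an actual experiment, each of these luminance values would be drawn as one monitor frame (e.g. via a stimulation library's per-frame drawing loop), and the same spectral logic would be applied to the recorded brain signal to quantify the tagged response.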

Interested in pupillometry, eye movements, visual attention, visual working memory, or related topics? Apply with our group in beautiful, livable, and friendly Groningen! Reach out to @elkanakyurek.bsky.social, @van-rijn.org, @olaf.dimigen.de, @miles2708.bsky.social or myself to explore options!

09.05.2025 09:25 👍 16 🔁 12 💬 2 📌 0

Instead of listing my publications, as the year draws to an end, I want to shine the spotlight on the commonplace assumption that productivity must always increase. Good research is disruptive and thinking time is central to high quality scholarship and necessary for disruptive research.

20.12.2024 11:18 👍 1151 🔁 375 💬 21 📌 57

I am a bit afraid the whole Starter Pack business leaves behind trainees – make sure to follow / repost them if their work interests you so their voices are heard #PsychSciSky #neuroskyence #compneurosky

21.11.2024 07:45 👍 161 🔁 42 💬 5 📌 4
Recipients 2024 – Einstein Foundation Award

Einstein Foundation awards go to Elisabeth Bik and PubPeer. 💪 award.einsteinfoundation.de/award-winner...

18.11.2024 09:58 👍 223 🔁 43 💬 5 📌 4

Thanks, will do!

01.11.2024 20:19 👍 0 🔁 0 💬 1 📌 0

Hi Martin, the preprint refers to "Supporting Information" that does not seem to be included in the document. Is the full version available somewhere?

01.11.2024 19:51 👍 0 🔁 0 💬 1 📌 0
A high-speed OLED monitor for precise stimulation in vision, eye-tracking, and EEG research
The recent introduction of organic light-emitting diode (OLED) monitors with refresh rates of 240 Hz or more opens new possibilities for their use as precise stimulation devices in vision research, ex...

Are you looking for a precise monitor for vision science, eye-tracking, or EEG? In a new preprint (w/ Arne Stein), we tested a new type of display with excellent performance in time-critical experiments: "High-speed" (240 Hz) OLED monitors. All details here: www.biorxiv.org/content/10.1...

04.10.2024 14:26 👍 5 🔁 2 💬 0 📌 0