
Linda Drijvers

@lindadrijvers

cognitive neuroscientist studying brain oscillations, multimodal language comprehension & production | PI of Communicative Brain lab | assistant professor at Donders Institute & research group leader at MPI for Psycholinguistics - https://lindadrijvers.nl

442
Followers
400
Following
12
Posts
07.01.2025
Joined

Latest posts by Linda Drijvers @lindadrijvers


📣📣📣 Job alert: Multimodal Language Department, Max Planck Institute for Psycholinguistics. MAX PLANCK RESEARCH GROUP LEADER POSITION (W2 BBESG) lnkd.in/eaq5MW9a

26.02.2026 20:32 👍 17 🔁 20 💬 1 📌 2

They just love spending time with you SO MUCH, they don't want to miss a minute.

04.12.2025 05:43 👍 2 🔁 0 💬 0 📌 0

Marijn Hafkamp, movement scientist and cognitive scientist, is looking for his next step in academia (a postdoc or junior faculty position); check out his work!

Also with @raphaelwerner @lindadrijvers.bsky.social @lucselen

01.12.2025 14:34 👍 3 🔁 1 💬 0 📌 0

Autistic adults, whether clinically diagnosed or self-identifying, report more difficulty processing auditory information, particularly others' speech.
doi.org/10.1177/1362...

New research by Elena Silva, Linda Drijvers
@lindadrijvers.bsky.social
and myself
(reposting with correct link)

19.11.2025 09:26 👍 5 🔁 2 💬 1 📌 0
Postdoc Position: Neural Circuitry Underlying Working Memory | Radboud University. Do you want to work as a postdoc on the neural circuitry underlying working memory at the Faculty of Social Sciences? Check our vacancy!

🚨🧠🚨 POSTDOC POSITION 🚨🧠🚨 in my lab: www.ru.nl/en/working-a... Connectivity in Working Memory. Deadline 12 Nov! Apply via website; please repost.

09.11.2025 09:17 👍 37 🔁 38 💬 1 📌 0
OSF

Planning on running a RIFT study? In a new manuscript, we put together the RIFT know-how accumulated over the years by multiple labs (@lindadrijvers.bsky.social, @schota.bsky.social, @eelkespaak.bsky.social, with Cecília Hustá and others).

Preprint: osf.io/preprints/ps...

29.10.2025 10:52 👍 22 🔁 8 💬 1 📌 1
IMPRS PhD Fellowships 2026 | Max Planck Institute

OPEN PHD POSITION - Come join our group! The IMPRS at @mpi-nl.bsky.social is offering a PhD position on modelling structured meaning in the brain, supervised by me and Helen De Hoop at the Centre for Language Studies in the @dondersinst.bsky.social -
www.mpi.nl/imprs-phd-fe... #NeuroJobs #cogsci

23.10.2025 09:05 👍 37 🔁 28 💬 0 📌 1
Regularization, Action, and Attractors in the Dynamical "Bayesian" Brain. Abstract: The idea that the brain is a probabilistic (Bayesian) inference machine, continuously trying to figure out the hidden causes of its inputs, has become very influential in cognitive (neuro)sc...

🧠 Regularization, Action, and Attractors in the Dynamical "Bayesian" Brain

direct.mit.edu/jocn/article...

(still uncorrected proofs, but the corrected version should be posted soon; OA is also forthcoming, for now the PDF is at brainandexperience.org/pdf/10.1162-...)

22.10.2025 08:59 👍 29 🔁 12 💬 2 📌 3

The first publication of the #ERC project 'LaDy' is a fact, and it's an important one, I think:

We show that word processing and meaning prediction are fundamentally different during social interaction compared to using language individually!
👀 short 🧵/1

psycnet.apa.org/fulltext/202...
#OpenAccess

10.10.2025 17:12 👍 36 🔁 9 💬 4 📌 0
What Using Our Hands While Speaking Reveals About Our Brains. Researchers now have fascinating insights into how our brains function while we communicate, and our hand gestures hold the key.

Happy with this popular science article about our work (with @lindadrijvers.bsky.social and @judithholler.bsky.social), showing that listeners use co-speech hand gestures to predict upcoming meaning ✋ www.medscape.com/viewarticle/...

15.10.2025 08:12 👍 11 🔁 5 💬 1 📌 0

We're seeking the next Director of the Max Planck Institute for Psycholinguistics! Lead cutting-edge research in language & cognition. Nominations (incl. self) due 19 Dec 2025.
mpi.nl/career-education/vacancies/vacancy/nominations-and-self-nominations-sought-position-director-max

03.10.2025 07:53 👍 40 🔁 45 💬 0 📌 2
PhD Position: Accented Speech Processing | Radboud University. Do you want to work as a PhD candidate on accented speech processing at the Faculty of Arts? Check our vacancy!

PhD Position: Accented Speech Processing - Apply now!

Come work with Mirjam Broersma, @davidpeeters.bsky.social, and me at the Centre for Language Studies, Radboud University in the Netherlands.

Application deadline: 19 October 2025

For more information, see
www.ru.nl/en/working-a...

02.10.2025 14:35 👍 26 🔁 31 💬 0 📌 0
EnvisionBOX overview2025 (YouTube video by Wim Pouw)

www.envisionbox.org has been shortlisted for the Leo Waaijers Open Science prize: ukb.nl/en/news/shor...

@babajideowoyele.bsky.social @jamestrujillo.bsky.social @sarkadava.bsky.social @DavideAhmar @acwiek.bsky.social

Amazing Markus Küpper made an animated video:
www.youtube.com/watch?v=HduI...

02.10.2025 12:28 👍 18 🔁 11 💬 2 📌 2
ERC Starting Grants for research into language, money circulation and medieval songs | Radboud University. Three researchers at Radboud University will receive a Starting Grant from the European Research Council (ERC), each worth roughly 1.5 million euros.

👀 👂 How does the brain merge what we hear & see? @lindadrijvers.bsky.social got an ERC Starting Grant (≈ €1.5M) for HANDWAVE, studying how we integrate audiovisual signals.

Vital for understanding language disorders & improving diagnostics. 👇

www.ru.nl/en/research/...

04.09.2025 14:55 👍 19 🔁 5 💬 0 📌 0

We are seeking a #Postdoctoral #Researcher to join our Multimodal Language Department for the NWO-funded project:
Grounded Gesture Generation in Context:
Object- and Interaction-Aware Generative AI Models of Language Use
Start: ideally February 1, 2026, but negotiable. www.mpi.nl/career-educa...

08.09.2025 13:54 👍 4 🔁 9 💬 0 📌 0
Rapid Invisible Frequency Tagging (RIFT) with a consumer monitor: A proof-of-concept Rapid Invisible Frequency Tagging (RIFT) enables neural frequency tagging at rates above the flicker fusion threshold, eliciting steady-state responses to flicker that is almost imperceptible. While R...

🚨 New preprint: Invisible neural frequency tagging (RIFT) for the underfunded researcher:
👉 www.biorxiv.org/cgi/content/...

RIFT uses high-frequency flicker to probe attention in M/EEG with minimal stimulus visibility and little distraction. Until now, it required a costly high-speed projector.

22.08.2025 11:52 👍 33 🔁 17 💬 4 📌 2
Using Rapid Invisible Frequency Tagging (RIFT) to Probe the Neural Interaction Between Representations of Speech Planning and Comprehension Abstract. Interlocutors often use the semantics of comprehended speech to inform the semantics of planned speech. Do representations of the comprehension and planning stimuli interact? In this EEG stu...

Using Rapid Invisible Frequency Tagging (RIFT) to probe the neural interaction between representations of speech planning and comprehension. New paper by Cecília Hustá, Antje Meyer & @lindadrijvers.bsky.social
doi.org/10.1162/nol_a_00171

04.08.2025 06:24 👍 10 🔁 1 💬 0 📌 0

Call for papers @jneurolang.bsky.social: Towards a Deeper Understanding of the Relationship Between the Neurobiology of Language and Consciousness. 🧠🗨️

direct.mit.edu/DocumentLibr...

โ˜Ž๏ธ DM me for more details.

24.07.2025 17:10 👍 8 🔁 5 💬 0 📌 0
(PDF) The Tracking Umbrella: Diverse Interpretations Under a Common Neural Term. Neural tracking, the alignment of brain activity with the temporal dynamics of sensory input, is a crucial mechanism underlying perception, ...

Neural tracking: evoked or oscillatory? 🎧🧠 The #TrackingUmbrella paper argues it's not either/or: both perspectives reveal how our brains align with speech, music, and more. Method matters. Let's rethink how we measure and interpret tracking! tinyurl.com/ywbe7rrz

17.07.2025 07:30 👍 8 🔁 3 💬 0 📌 0

Deadline 23 June!! Please re-bleat (??) widely!

19.06.2025 18:55 👍 9 🔁 9 💬 0 📌 1

📢 We are thrilled to announce that the full program for #ISGS2025 is now available on our website: isgs10.nl

✨ Explore the detailed schedule featuring the latest advances in multimodal language research ✨

13.06.2025 15:47 👍 7 🔁 4 💬 0 📌 3
Calls – Dutch Society for Brain and Cognition

Don't forget to submit your symposia and panels for the Dutch brain and cognition conference happening in December! www.societyforbrainandcognition.nl/calls/

12.06.2025 13:25 👍 0 🔁 1 💬 0 📌 0

Finally out in @commsbio.nature.com!
Using MEG and Rapid Invisible Frequency Tagging (RIFT) in a classic visual search paradigm, we show that neuronal excitability in V1 is modulated in line with a priority-map-based mechanism to boost targets and suppress distractors!

11.06.2025 20:40 👍 18 🔁 4 💬 1 📌 0
Parallel and dynamic attention allocation during natural reading During natural reading, attention constantly shifts across words, yet how linguistic properties (e.g., lexical frequency) impact the allocation of attention remains unclear. In this study, we co-regis...

Our latest study on reading, using MEG, eye tracking and Rapid Invisible Frequency Tagging, headed by Yali Pan from @thechbh.bsky.social: Parallel and dynamic attention allocation during natural reading
www.biorxiv.org/content/10.1...

30.05.2025 09:00 👍 26 🔁 7 💬 0 📌 1

📉 Key findings: Mothers suffer a penalty of 25 percentage points on their annual output of scientific publications compared to fathers in the first seven years after childbirth, and participation in research publishing is 16% lower among mothers than among fathers.

22.05.2025 08:03 👍 7 🔁 8 💬 3 📌 1

*scream laughs in toddler mom*

26.05.2025 13:02 👍 26 🔁 6 💬 1 📌 0

Great opportunity on a very exciting project!

20.05.2025 15:24 👍 4 🔁 2 💬 0 📌 0
How the human brain is like a murmuration of starlings | Aeon Essays The brain is much less like a machine than it is like the murmurations of a flock of starlings or an orchestral symphony

๐—ฃ๐—ฎ๐—ฟ๐—ฎ๐—น๐—น๐—ฒ๐—น ๐—ฑ๐—ถ๐˜€๐˜๐—ฟ๐—ถ๐—ฏ๐˜‚๐˜๐—ฒ๐—ฑ ๐—ฏ๐—ฟ๐—ฎ๐—ถ๐—ป ๐—ณ๐˜‚๐—ป๐—ฐ๐˜๐—ถ๐—ผ๐—ป, ๐—ต๐—ผ๐˜„ ๐—ฑ๐—ผ๐—ฒ๐˜€ ๐—ถ๐˜ ๐˜„๐—ผ๐—ฟ๐—ธ?
Don't expect simple mappings between mind and brain.
Check out this piece I wrote for Aeon.
aeon.co/essays/how-t...

19.05.2025 17:28 👍 86 🔁 21 💬 4 📌 2
Facial clues to conversational intentions. It has long been known that we use words to perform speech acts foundational to everyday conversation, such as requesting, informing, proposing, or co…

New review out on facial signalling in conversation, arguing that the notion of 'speech acts' is outdated due to its unimodal focus, especially in light of the fundamental contributions of the face to social action communication: www.sciencedirect.com/science/arti...

12.05.2025 07:10 👍 36 🔁 14 💬 3 📌 0

Het!

08.05.2025 17:53 👍 1 🔁 0 💬 0 📌 0