#neurojob #cogpsych #neuroskyence #visionScience #academicsky
10 PhD positions at JLU Giessen in the new Research Training Group "PIMON"! We will explore how humans perceive and interact with materials and objects in natural environments.
More information on the project, the PIs, and how to apply here:
www.uni-giessen.de/de/ueber-uns...
Please share!
🚨 Preprint alert! 🚨
Check out @alpekinci.bsky.social's preprint: "Shared gaze reflects shared aesthetic experiences" #neuroaesthetics
@predictivebrain.bsky.social, @buffalosentence.bsky.social, @liadmudrik.bsky.social, @bencenanay.bsky.social, @clarepress.bsky.social, @sampendu.bsky.social, @clairesergent.bsky.social.
4/4
Participation fee is 240 €, covering accommodation and meals during the workshop.
Travel to Rauischholzhausen is organized individually.
Registration deadline is March 8, 2026.
3/4
In addition to attending, we invite participants to contribute posters and discuss their own research.
Register and indicate your poster contribution here:
forms.gle/V2zHVsiRBvKU...
2/4
The workshop covers diverse aspects of perceptual inference and how it shapes our conception of the world. Talks by exciting international speakers span philosophy, behavioral, and neuroimaging work, ranging from low-level vision and social perception to unconscious processing.
📢 Workshop announcement.
We are super excited to announce the workshop "Perceptual Inferences, from philosophy to neuroscience", organized by Alexander Schütz and Daniel Kaiser.
📍 Rauischholzhausen Castle, near Marburg, Germany
🗓️ June 8 to 10, 2026.
1/4
🚨 Preprint alert! 🚨
Check out @suzibot.bsky.social's preprint: "Visual search is constrained by the variability of object-category templates."
Includes some neat findings on individual differences in search.
We're happy to announce that Gongting Wang has successfully defended his PhD thesis at Freie Universität Berlin. Congratulations, Dr. Wang! 🎉
🚨 Preprint alert! 🚨
Check out @michaengesee.bsky.social's preprint on individual differences in expectations about natural scenes and how they shape how we perceive and neurally represent scenes.
2025 - Christmas Party Crew!
Here's a press release (in German):
www.uni-giessen.de/de/ueber-uns...
And here's tagging some of the great people involved: @kathadobs.bsky.social, @martinhebart.bsky.social, @haplab.bsky.social, @peelen.bsky.social.
Super happy to announce that our Research Training Group "PIMON" is funded by the @dfg.de! Starting in October, we will have exciting opportunities for PhD students who want to explore object and material perception & interaction in Gießen @jlugiessen.bsky.social! Just look at this amazing team!
I am very excited to share our new preprint, spearheaded by the brilliant @lunahuestegge.bsky.social, w/ @peterkok.bsky.social and others: "An attempt to push mental imagery over the reality threshold using non-invasive brain stimulation"
doi.org/10.31234/osf...
Decoding the rhythmic representation and communication of visual contents
www.cell.com/trends/neuro...
#neuroscience
From Micha's farewell gathering before his temporary leave - he'll still be missed!
A few snapshots from this year's Kaiser Lab retreat ✨
Together with @seeingxie.bsky.social, @singerjohannes.bsky.social, Bati Yilmaz, @dkaiserlab.bsky.social, Radoslaw M. Cichy.
Using the visual backward masking paradigm, this study disentangles the contributions of feedforward and recurrent processing, revealing that recurrent processing significantly shapes object representations across the ventral visual stream.
journals.plos.org/plosbiology/...
How can we characterize the contents of our internal models of the world? We highlight participant-driven approaches, from drawings to descriptions, to study how we expect scenes to look!
In this paper we present flexible methods for participants to express their expectations about natural scenes.
Time to expand how we study natural scene perception!
w/ @michaengesee.bsky.social, @suzibot.bsky.social, Ilker Duymaz, Gongting Wang, Matthew J. Foxwell, Radoslaw M. Cichy, David Pitcher & @dkaiserlab.bsky.social
From line drawings to scene perception: our new review argues for moving beyond experimenter-driven manipulations toward participant-driven approaches to reveal what's in our internal models of the visual world.
royalsocietypublishing.org/doi/10.1098/...
Kaiser Lab is at ECVP this year! Come check out our studies!
Go Lu!! Congratulations 🎉
We had an awesome time discussing the role of feedback in perception @unil.bsky.social and exploring the beautiful city of Lausanne! Thanks to @icepfl.bsky.social @davidpascucci.bsky.social
All-topographic ANNs! Now out in @nathumbehav.nature.com, led by @zejinlu.bsky.social, in collaboration with @timkietzmann.bsky.social.
See below for a summary 👇
Don't miss out again! If you are interested in our studies presented at @VSSMtg, you can find our posters here:
drive.google.com/drive/mobile...