IMAGINE-decoding-challenge
Can you predict which words participants were hearing, using classifiers trained on brain activity recorded while participants visually viewed the same items?
How well do classifiers trained on visual activity actually transfer to non-visual reactivation?
#Decoding studies often rely on training in one (visual) condition and applying the classifier to another (e.g. rest reactivation). But how well does this actually work? Show us what makes it work and win up to $1000!
24.10.2025 06:55
Preprint alert!
1/ Can we accurately detect sequential replay in humans using Temporally Delayed Linear Modelling (#TDLM)? In our recent study, we could not find any replay and decided to dig deeper by running a hybrid simulation, with surprising results. Link to preprint & details below.
16.06.2025 07:22
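At its core, TDLM (as described by Liu and colleagues) is a two-level regression: first, regress the decoded state time courses at time t+lag onto those at time t to get an empirical transition matrix per lag; second, test how well that matrix matches a hypothesized sequence. This is a simplified sketch on simulated decoded-state time courses, with illustrative shapes, a planted 1→2→3→4 sequence, and the second level reduced to a correlation for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)
n_time, n_states = 2000, 4

# Simulated decoded-state time courses with a planted 1 -> 2 -> 3 -> 4
# sequence recurring at a fixed lag of 5 samples.
lag_true = 5
X = rng.random((n_time, n_states)) * 0.1
for start in rng.integers(0, n_time - 4 * lag_true, size=100):
    for k in range(n_states):
        X[start + k * lag_true, k] += 1.0

# Hypothesized transition matrix: T[i, j] = 1 if state i precedes state j.
T_forward = np.roll(np.eye(n_states), 1, axis=1)
T_forward[-1, :] = 0  # the last state does not loop back to the first

def sequenceness(X, T, lag):
    """First level: regress X(t+lag) on X(t) -> empirical transitions B.
    Second level (simplified): correlate B's off-diagonal with T's."""
    X_past, X_future = X[:-lag], X[lag:]
    B, *_ = np.linalg.lstsq(X_past, X_future, rcond=None)
    mask = ~np.eye(X.shape[1], dtype=bool)
    return np.corrcoef(B[mask], T[mask])[0, 1]

scores = {lag: sequenceness(X, T_forward, lag) for lag in range(1, 15)}
best_lag = max(scores, key=scores.get)
print(f"strongest sequenceness at lag {best_lag}")
```

The full method additionally regresses the lag-wise transition matrices onto forward, backward, autocorrelation, and constant regressors, and assesses significance against permuted state orderings; the sketch above only captures the first-level idea.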
Join us for the next Mannheim Open Science Meetup!
Topic: ARIADNE – A Scientific Navigator to Find Your Way Through the Research Resource Labyrinth
Speaker: Çağatay Gürsoy, Central Institute of Mental Health
Date & Time: April 30, 2025, 3:00 PM
Location: online; register at lnkd.in/exErgJfy
20.03.2025 12:04
Also, credit where credit is due: Figure 2 is based on code by the amazing @tomhardwicke.bsky.social and this paper doi.org/10.1098/rsos...! Thanks for sharing your code openly, Tom!
08.04.2025 11:35
Proud to share this new meta-science article: our analysis of 255 preclinical opioid addiction studies highlights a pressing need for better transparency and reproducibility. Big thanks to Justine Blackwell and @alexh.bsky.social; it was an honor working with you on this project! :)
08.04.2025 08:54
I can highly recommend this opportunity for postdocs in cognitive neuroscience.
04.02.2025 09:27
New preprint on "Attitudes Toward Open Science Practices"!
We asked 596 German psychologists about their worries and hopes towards open science practices!
ECRs reported both more worries and more fears, but the more you use open science practices, the fewer worries and the more hopes you have.
osf.io/preprints/ps...
28.01.2025 09:50
I made one for stats papers
18.11.2024 04:02
@juliabeitner.bsky.social presents LIFOS, a platform where students can learn about and train #OpenScience practices in a safe environment. #dgps2024
18.09.2024 10:42
If you are interested in our holistic Open Science training platform, dedicated to students, you can find the slides from my talk here:
osf.io/ug6mk
18.09.2024 13:43
Deadline extended: We've set the #ManyBeds registration deadline to August 31st! If you're interested in contributing to sleep and memory research, there's still time to get involved.
Join us and help advance this important work.
Sign up now!
09.08.2024 12:28
New findings of disappointing rates of methodological rigor and transparency!
We coded all 255 papers we found on animal models of opioid addiction published between 2019 and 2023.
Rates of bias-minimization practices and sample-size calculations were… unsatisfactory.
osf.io/preprints/ps...
31.07.2024 22:19
Yes, certainly! For the replication part, the hypotheses mainly concern behavioral findings and only one hypothesis is about the sleep EEG. We will upload the hypotheses soon. You can also team up with another person with EEG experience if you like. Let me know if you have any more questions :)
25.07.2024 13:45
Thank you, Juli!
25.07.2024 10:31
Flyer of the ManyBeds project. The left side shows the logo in white on a colorful gradient background. On the right is a brief description of the study and the link to the project's website.
Interested in sleep and memory research? Then join the #ManyBeds project! A multi-lab, many-analysts replication study led by @gordonfeld.bsky.social and me.
We seek contributors for data collection and analysis.
Sign up now!
Learn more here: cimh-clinical-psychology.github.io/ManyBeds/
24.07.2024 14:17
Congratulations!
28.05.2024 09:53
RETRACTED: A Perception Study for Unit Charts in the Context of Large-Magnitude Data Representation
Unit charts are a common type of chart for visualizing scientific data. A unit chart is a chart used to communicate quantities of things by making the number of symbols on the chart proportional to th...
An article about data visualization was retracted 1.5 years after I pointed out errors.
The notice says that "concerns were raised".
I spent dozens of hours contacting authors and editors, reproducing analyses, and following up on ignored emails.
But I'm not mentioned in the retraction notice.
24.05.2024 11:24
Express interest in repliCATS workshops
The repliCATS project will run a series of workshops in 2024 as part of the SMART: Preprints project in collaboration with the Center for Open Science.
The goal of repliCATS (Collaborative Assessment...
Hey everyone, we have 5 spots left for our in-person repliCATS workshop in Nairobi! If you are going to SIPS2024 @improvingpsych.bsky.social we will be running a workshop on day 2, with US$250 travel grant for all participants. Express interest: forms.gle/WcbH6Ufizobb... [re-skeets welcome].
22.05.2024 01:09
Zoom on section b) of a figure displaying the experimental procedure. Julia can be seen modeling a participant in VR and in the desktop screen experiment.
Lastly, I'm really grateful to my colleagues at the SceneGrammarLab who made this study happen and allowed me to include photographs of myself in a figure. Look mom, I'm a model in a scientific paper! 4/4
21.05.2024 11:47
We also compared the procedure between the VR and 3D desktop-screen settings and found only slight differences. This indicates that the screen setup can elicit behavior comparable to VR. More research is needed to dive into the nuances and underlying processes. 3/4
21.05.2024 11:45
In line with our expectations, limited visual input did not impair, and even benefited, incidental memory of encountered objects. Moreover, spatial memory of scenes seen only with a flashlight was overall just as good as memory of illuminated scenes! 2/4
21.05.2024 11:45
Photo of the Sydney Opera House and Harbour Bridge, framed by mangrove branches.
Again, we have four(!) new positions here in psychology at the University of Sydney.
Like N. American tenure-track, but easier to get "tenure" (become permanent). Let me know of any questions! usyd.wd3.myworkdayjobs.com/USYD_EXTERNA... photo: Alex Holcombe
17.03.2024 23:52
Thank you Alex! Means a lot :)
18.03.2024 09:20
Thank you, Lisa!
17.03.2024 08:57
As a psychologist I don't know about sociology, but it certainly is a question of interest in psychology!
16.03.2024 11:46
The title is "Visual search goes real: Transitioning from the experimentalist's laboratory to more naturalistic settings". I studied visual search in VR and evaluated the field, as well as VR, in terms of ecological validity :)
16.03.2024 11:37