Excited to share that our MEG project is now out in Current Biology! We show how visual content codes relate to motor oscillations in telling time.
Huge thanks to Quirin Gehmacher, Peter Kok, Matt Davis and Clare Press (bsky links below). 🧵
authors.elsevier.com/sd/article/S...
06.03.2026 17:29
Had a quick play around in Coder View on my Mac and it feels incredibly lightweight. Bravo!
24.02.2026 13:21
📣 PsychoPy Studio is out now!!
✨ Which also means swish new branding on psychopy.org ✨
We really hope that this improves user experience in general.
Please download and try it. Let us know how you get on via the forum discourse.psychopy.org/t/introducin...
23.02.2026 16:30
Aw shucks :)
22.02.2026 21:08
Will this be recorded per chance? :)
22.02.2026 17:34
Program for the Brisbane Experimental Psychology Student Initiative meeting at 4pm, Wednesday 25th February at The University of Queensland. Featuring Timothy Cottier, Postdoctoral Fellow, Western Sydney University, presenting 'Evidence that super-recognisers spontaneously and preferentially process face-identity information' and
Tim Gastrell, PhD Student, Queensland Brain Institute presenting 'Neural tuning to visual motion depends on the precision of learned priors'.
Our first meeting for 2026 is just 5 days away!
To kick things off we have 2 awesome Tims presenting their research @tvcottier.bsky.social & @tgastrell.bsky.social
You can join in-person or online via Zoom - register here to receive the link 👇
uniofqueensland.syd1.qualtrics.com/jfe/form/SV_...
20.02.2026 08:26
It was my pleasure! :)
FYI: @psychopy.org is looking for more Australian Ambassadors. Definitely reach out if you're interested!
16.02.2026 22:24
🙏🥹
16.02.2026 07:12
[Photos of the seven-strong committee for the Brisbane Experimental Psychology Student Initiative]
Introducing @brisepsi.bsky.social - we're an experimental psychology initiative run by grad students in Brisbane (Meanjin) across UQ & QUT 🧠
We host a monthly dose of freshly baked research (students, ECRs, & bigwigs) followed by social goodness. Help us grow with a follow! 🤝 #neuroskyence
16.02.2026 05:58
AI is not a peer, so it can't do peer review
If we still believe that science is a vocation grounded in argument, curiosity and care, we can't delegate judgement to machines, says Akhil Bhardwaj
'to treat peer review as a throughput problem is to misunderstand what is at stake. Review is not simply a production stage in the research pipeline; it is one of the few remaining spaces where the scientific community talks to itself.' 1/3
03.02.2026 08:17
He's simply a grifter
28.01.2026 18:44
Interpreting EEG requires understanding how the skull smears electrical fields as they propagate from the cortex. I made a browser-based simulator for my EEG class to visualize how dipole depth/orientation change the topomap.
dbrang.github.io/EEG-Dipole-D...
Github page: github.com/dbrang/EEG-D...
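The simulator's own forward model isn't reproduced here, but the qualitative intuition can be sketched with the textbook potential of a point current dipole in an unbounded homogeneous conductor (a deliberate simplification that ignores the layered skull, so all parameter values below are illustrative only): the deeper the dipole, the flatter and more spatially smeared its scalp topography.

```python
import math

def dipole_potential(x_cm, depth_cm, moment=1.0, sigma=0.33):
    """Potential of a radial current dipole in an unbounded homogeneous
    conductor: V = p.r / (4*pi*sigma*|r|^3). The dipole sits depth_cm
    below the scalp plane; x_cm is lateral distance along the scalp
    from the point directly above it. Units are arbitrary."""
    r_sq = x_cm ** 2 + depth_cm ** 2
    return moment * depth_cm / (4 * math.pi * sigma * r_sq ** 1.5)

def focality(depth_cm, x_cm=3.0):
    """Ratio of the potential 3 cm away to the peak directly above the
    dipole: a small ratio means a focal topomap, a large one a smeared map."""
    return dipole_potential(x_cm, depth_cm) / dipole_potential(0.0, depth_cm)

# A shallow source gives a focal map; a deep source gives a smeared one.
shallow, deep = focality(1.0), focality(4.0)
```

Adding the skull's low conductivity on top of this smears the map even further, which is the effect the simulator lets students explore interactively.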
20.01.2026 17:00
Most popular decision-making models assume that cognitive processes are static over time. In our new paper in Psych Review, we offer a simple extension to evidence accumulation models that lets researchers account for systematic changes in parameters across time 👇
psycnet.apa.org/fulltext/202...
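This is not the paper's actual extension, but the general idea can be illustrated with a toy evidence-accumulation simulation in which the drift rate is a function of time (here a hypothetical linear ramp) rather than a fixed constant:

```python
import math
import random

def simulate_trial(drift_fn, threshold=1.0, dt=0.01, noise=1.0, rng=random):
    """Accumulate noisy evidence until one of two boundaries is hit.
    drift_fn(t) returns the (possibly time-varying) drift at time t.
    Returns (choice, rt): choice is +1 (upper bound) or -1 (lower bound)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift_fn(t) * dt + noise * math.sqrt(dt) * rng.gauss(0, 1)
        t += dt
    return (1 if x > 0 else -1), t

# Static model: constant drift. Time-varying extension: drift ramps up
# over the trial (all parameter values here are made up for illustration).
static_drift = lambda t: 0.8
ramping_drift = lambda t: min(2.0, 2.0 * t)

rng = random.Random(0)
trials = [simulate_trial(ramping_drift, rng=rng) for _ in range(200)]
upper = sum(1 for choice, _ in trials if choice == 1) / len(trials)
```

Because the drift starts near zero, early responses are diffusion-dominated and error-prone, while later responses are driven by the ramped-up drift - exactly the kind of systematic within-trial change a static model cannot capture.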
20.01.2026 22:25
[GIF: a cartoon of Donald Duck says "and a bah humbug" to you]
Our publishing system does not prioritise or value the careful curation of research data to be FAIR nearly enough. I have been data editing for AP&P for a year now, and it is sad to see no reward for the clearly careful organisation of data and materials vs that which is thrown on OSF with no care!
18.01.2026 02:33
Thanks Junjie!
28.12.2025 09:19
Also, I apologise for the poor figure quality in the HTML version of the article. Elsevier's typesetting team made some nonsense changes that I did not consent to, which have somehow proved to be frustrating to fix on their end.
The PDF version is fine though!
12.12.2025 08:11
I'd like to thank my co-authors (particularly Naohide and Jonny) and reviewers for helping me elevate the quality of this work 🙏
12.12.2025 08:11
Perhaps the coolest result was that these surprise signals were *shared across attributes*. That is, classifiers trained to decode surprise for shape could reliably do so for colour (and vice versa), after accounting for latency shifts.
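For readers unfamiliar with cross-decoding, the logic can be sketched with a toy nearest-centroid classifier on synthetic patterns (purely hypothetical data, not the paper's pipeline): if surprise evokes a pattern shared across attributes, a classifier trained on "shape" trials should transfer to "colour" trials. Here both conditions are drawn from the same underlying surprise pattern by construction, which is the shared-code assumption being illustrated.

```python
import random

def fit_centroids(X, y):
    """Nearest-centroid training: mean pattern per class label."""
    centroids = {}
    for label in set(y):
        rows = [x for x, lab in zip(X, y) if lab == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def predict(centroids, x):
    sq_dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda lab: sq_dist(centroids[lab], x))

rng = random.Random(1)
shared = [1.0, -1.0, 0.5, 0.0]   # surprise pattern common to both attributes

def trial(surprise):
    base = shared if surprise else [0.0] * 4
    return [b + rng.gauss(0, 0.3) for b in base]

# Train on "shape" trials, test on "colour" trials: transfer succeeds
# because the surprise code is shared across attributes by construction.
y_train = [True] * 50 + [False] * 50
centroids = fit_centroids([trial(s) for s in y_train], y_train)
y_test = [True] * 50 + [False] * 50
acc = sum(predict(centroids, trial(s)) == s for s in y_test) / len(y_test)
```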
12.12.2025 08:11
Interestingly, we were still able to decode multivariate whole-scalp representations of surprise (neutral vs. violation) separately for each attribute. Moreover, these signals were reliable from ~250 ms, suggesting that surprise is predominantly signalled after the initial feedforward sweep.
12.12.2025 08:11
We first looked at the evoked responses and found classic effects of adaptation via the constant vs. change sequence comparisons.
This said, we found no evidence for visual surprise after controlling for cortical adaptation (i.e., when comparing surprising changes to neutral changes).
12.12.2025 08:11
Here, we recorded EEG from participants who viewed sequences of a bound object that changed in either colour or shape over four steps. Crucially, the contexts of these changes were designed to appear random (and unsurprising) or violate the established trajectory (and cause surprise).
12.12.2025 08:11
But when does the visual system signal surprise? And do the dynamics of a surprise signal depend on which attributes (features) violate a prediction? This is important to think about, given the functionally segregated organisation of the visual system.
12.12.2025 08:11
Predictive coding theories assert that the brain uses prior knowledge when resolving percepts. Deviations between what is predicted and sensed generate surprise signals (so-called 'prediction errors'), which calibrate the relevant erroneous predictions.
12.12.2025 08:11
And it's out now in Cortex: www.sciencedirect.com/science/arti...
Summary below 🧵
12.12.2025 08:11
This suggests that visual surprise may operate at the bound object level and/or is a domain-general response.
This is identical to the conclusions drawn from our previous work :)
doi.org/10.1016/j.co...
12.12.2025 07:54