Interesting!
@fahrenfort
Assistant Prof at VU Amsterdam. Neuroscience of consciousness, decision making. Computational modeling. Pet method: EEG. Critical of subjective measures. Co-PI in the http://consciousbrainlab.com with @svangaal.bsky.social and @timostein.bsky.social.
Interesting!
A timely reminder that social progress doesn't come easy. The idea of half the population not being allowed to vote might sound bizarre now but was reality until embarrassingly recently.
I gave a talk about subjective and objective approaches in the study of conscious perception last week in a BAMΞ workshop @uni-bamberg.de hosted by @johanneskleiner.bsky.social, @jolienfrancken.bsky.social and @ronyhirsch.bsky.social. You can find the recording here: www.youtube.com/watch?v=S1zB...
Good analysis! Makes you wonder why the authors (or their reviewers) did not consider it necessary to analyse RT data from a similar offline (or pre-2022 online) study using their metrics. It's not that those RT data are not available.
Recently, van der Stigchel and colleagues posted a provocative commentary suggesting that we should be wary of bots in online behavioral data collection (🧵 by @cstrauch.bsky.social here: bsky.app/profile/cstr...). But should we? Here is my response letter osf.io/preprints/ps.... 1/5
What is the brain for? Active inference is widely discussed as a unifying framework for understanding brain function, yet its empirical status remains debated. Our review identifies core predictions across the action-perception cycle and evaluates their empirical support: osf.io/preprints/ps...
Do goal-directed actions minimize prediction error? Together with @haslagter.bsky.social and @fahrenfort.bsky.social I identified falsifiable predictions of active inference and reviewed the extent to which they are supported by empirical results. Read the preprint here: tinyurl.com/2by8k3h6
WHY IS NOBODY PANICKING OVER HUGE DRONES ABOVE SENSITIVE MILITARY / NUCLEAR INSTALLATIONS ACROSS NEW JERSEY, DENMARK, GERMANY, BELGIUM, THE NETHERLANDS ETC. SHOOT THEM DOWN FFS!
Most of you know @suryagayet.bsky.social as a successful visual-attention researcher. But he also had an active #music career as a #rap artist. And like Jay-Z before him, he has briefly come out of retirement with a new album. Check it out, it's very good! 🎤🎶 open.spotify.com/album/7HrnAB...
it is time to make my biyearly post because I put a preprint out. we (w/ @svangaal.bsky.social, Z. van den Hurk, @timostein.bsky.social & @fahrenfort.bsky.social) attempted to replicate a classic unconscious priming study by Vorberg et al. (2003) using a single-subject Bayesian approach.
Great overview conceptualizing approaches for studying sensory conscious perception. I myself have research experience with both subjective and objective approaches, but it was nice to survey the broader landscape and see where I land. Table 1 is especially informative.
Pre-print: "Subjective and objective approaches in the study of conscious perception" will be a chapter in www.horizon-minds.com. We explain that subjective and objective are poorly defined constructs and provide a taxonomy.
doi.org/10.31234/osf...
with @svangaal.bsky.social @timostein.bsky.social
What could possibly go wrong?!
nypost.com/2025/10/16/b...
I feel seen...
I thought we only did this on April 1st?
Figuring out how the brain uses information from visual neurons may require new tools, writes @neurograce.bsky.social. Hear from 10 experts in the field.
#neuroskyence
www.thetransmitter.org/the-big-pict...
There is no eye tracking in this paper, just pupil size and the code is specifically about how to provide the biofeedback on pupil size. This is not difficult to set up once you have an eye tracker. I don't think they have "built" an eye tracker but ok. Also, where are the analysis scripts?
Even if this is true, I think it's stupid. It's not a high-tech thing to provide feedback on pupil size, everybody can do this. The work is not the code here, the work is doing something useful/marketable with it. No mention of the analysis scripts either.
Also, where is the analysis code? Surely there is a lot more to this paper than just the biofeedback algorithm?
Code availability: The code of the pupil-based biofeedback algorithm cannot be made publicly available since it is proprietary software of ETH Zurich and cannot be shared beyond the detailed description of the algorithm given in the methods section. However, researchers interested in verifying and reproducing our results can do so on location in a secured environment at the Neural Control of Movement Laboratory, ETH Zurich, upon signing a confidentiality agreement.
Who thinks this is an acceptable statement about Code Availability given the move towards Open Science? Are you out of your mind @ethz.ch? Today is 2025, not 2005. I'm also surprised that @natcomms.nature.com accepts such a statement. It is ridiculous really. Paper here: doi.org/10.1038/s414...
australian street style, 1973
That would be a positive outcome
Same. This was actually quite good.
This is impressively good
My god you're useless.
Just sign the damn petition Edwin!
It's looking increasingly less likely that AI will take over the world and replace humanity in the process. Maybe GPT-6?
The EU wants to ban words like "burger" for veggie burgers. What are we supposed to call them then? The veggie thing formerly known as burger? Go do something useful with your time, idiots; this is why the UK left. Please sign the petition and stop the meat lobby.
weplanet.yourmovement.org/p/noconfusio...