Public communication alters private confidence
Andreassen et al. demonstrate that the confidence we exhibit in public feeds back into our private assessment of confidence.
How does uncertainty transmit from one head to another? Our new paper out today in @currentbiology.bsky.social reveals how public communication alters private confidence.
w/ Einar Andreassen & @cdfrith.bsky.social
@birkbeckpsychology.bsky.social
@leverhulme.ac.uk
09.03.2026 16:20 · 56 likes · 23 reposts · 0 replies · 0 quotes
Every time you experience something new, your brain faces a decision: Should it update an existing memory or create a new one?
In our new paper in @sfnjournals.bsky.social #JNeurosci, we isolate that exact decision, moment by moment during learning 🧵
06.03.2026 18:54 · 131 likes · 46 reposts · 3 replies · 1 quote
Excited to share new work on how the brain makes social inferences from visual input! 🧠
(With @lisik.bsky.social , @shariliu.bsky.social, @tianminshu.bsky.social , and Minjae Kim!) www.biorxiv.org/content/10.6...
26.02.2026 22:09 · 45 likes · 16 reposts · 1 reply · 2 quotes
How do we make new friends after moving? Primarily through work or classes, or through friends of friends… and making friends via "friends of friends" predicts greater happiness, meaning, and psychological richness
@chriswelker.bsky.social #SPSP2026
26.02.2026 17:13 · 17 likes · 4 reposts · 0 replies · 1 quote
Our BehaveAI paper has just come out!
Easy & effective tracking & behavioural classification, even with tiny (2 px), fast-moving, camouflaged objects.
Paper: doi.org/10.1371/jour...
Download: github.com/troscianko/B...
@uniexecec.bsky.social @kevinjgaston.bsky.social @jimamclgalloway.bsky.social
21.02.2026 09:02 · 87 likes · 36 reposts · 4 replies · 1 quote
🧵 New paper to appear in The Handbook of Linguistics and Multimodality: "Communicating from head to toe", with @acwiek.bsky.social and @susfuchs.bsky.social 🥰 We review evidence that language is fundamentally physical, shaped by anatomy and biomechanics from your skull to your soles. 🗣️🦵 [1/6]
20.02.2026 11:08 · 11 likes · 4 reposts · 1 reply · 0 quotes
Interrupting my regular whimsical watercolor painting schedule to show you all my other creative outputs: science! 🥹
11.02.2026 20:39 · 11 likes · 2 reposts · 0 replies · 0 quotes
Fig. 1. a. Visual and auditory regions of interest (ROIs). b. Responses in a combination of visual (e.g., early dorsal visual stream; Fig. 1a, middle panel) and auditory regions were used to predict responses in the rest of the brain using MVPN. c. To identify brain regions that combine auditory and visual responses, we located voxels where predictions generated from the combined patterns of auditory regions and one set of visual regions jointly (as in Fig. 1b) are significantly more accurate than predictions generated from the auditory regions alone or that set of visual regions alone.
I'm excited to share my first first-authored paper, "Distinct portions of superior temporal sulcus combine auditory representations with different visual streams" (with @mtfang.bsky.social and @steanze.bsky.social), now out in The Journal of Neuroscience!
www.jneurosci.org/content/earl...
02.10.2025 15:20 · 22 likes · 11 reposts · 1 reply · 0 quotes
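The joint-prediction logic described in Fig. 1c can be sketched as a regression comparison: predict a target region's response patterns from auditory patterns alone, from visual patterns alone, and from both combined, then ask whether the combined model is more accurate. A minimal sketch on synthetic data, assuming a plain least-squares mapping (the variable names and model choice here are illustrative stand-ins, not the paper's actual MVPN pipeline):

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination for multivariate targets."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean(axis=0)) ** 2)
    return 1.0 - ss_res / ss_tot

def predict(X_train, Y_train, X_test):
    """Least-squares mapping from source patterns X to target patterns Y."""
    W, *_ = np.linalg.lstsq(X_train, Y_train, rcond=None)
    return X_test @ W

rng = np.random.default_rng(0)
n_train, n_test = 200, 50
aud = rng.standard_normal((n_train + n_test, 20))  # auditory ROI patterns
vis = rng.standard_normal((n_train + n_test, 20))  # visual ROI patterns

# Hypothetical target region that genuinely combines both inputs.
W_a = rng.standard_normal((20, 10))
W_v = rng.standard_normal((20, 10))
target = aud @ W_a + vis @ W_v

tr, te = slice(0, n_train), slice(n_train, None)
r2_aud = r2_score(target[te], predict(aud[tr], target[tr], aud[te]))
r2_vis = r2_score(target[te], predict(vis[tr], target[tr], vis[te]))
both = np.hstack([aud, vis])
r2_both = r2_score(target[te], predict(both[tr], target[tr], both[te]))

# For a region that combines both sources, the joint model should win.
print(r2_both > max(r2_aud, r2_vis))
```

In the paper's framing, the voxels of interest are those where this joint-versus-single accuracy gap is statistically significant, not merely positive as in this noiseless toy example.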
Very excited to share @landrybulls.bsky.social's first lead-author preprint in my lab! Using datasets from MySocialBrain.org we measured people's beliefs about how mental states change in intensity over time, the dimensional structure of those beliefs, and their correlates: osf.io/preprints/ps... 🧵
16.09.2025 15:08 · 21 likes · 4 reposts · 0 replies · 0 quotes
Of course, none of this would have been at all possible without the amazing @dianatamir.bsky.social and my super-advisor @markthornton.bsky.social - thank you both!
16.09.2025 14:52 · 0 likes · 0 reposts · 0 replies · 0 quotes
These findings indicate that people's beliefs about mental state intensity dynamics are incorporated into a wide variety of domains and generalize across cultures and beyond the lab. Our findings may lay the groundwork for future research on how people acquire and use mental state concepts.
16.09.2025 14:50 · 1 like · 0 reposts · 1 reply · 0 quotes
We then show that the structure of people's beliefs about intra-state dynamics reflects how people make other mental state judgments (namely, their conceptual similarity and transition probability) and how different mental state words are used in written text across a variety of cultures.
16.09.2025 14:50 · 0 likes · 0 reposts · 1 reply · 0 quotes
We go on to characterize how these mental states' intensity profiles interrelate using a curve similarity metric called the Fréchet distance, which captures the similarity of two curves' overall shapes while abstracting away from the specific indices that correspond to different curve features.
16.09.2025 14:50 · 1 like · 0 reposts · 1 reply · 0 quotes
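In practice the Fréchet distance between sampled curves is usually approximated by its discrete variant, computed by dynamic programming over point pairs. A minimal pure-Python sketch (an illustration, not the authors' code):

```python
from functools import lru_cache
from math import dist

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between two polygonal curves,
    each given as a sequence of (x, y) points."""
    @lru_cache(maxsize=None)
    def dp(i, j):
        # Smallest "leash" needed to walk both curves up to P[i], Q[j],
        # moving forward one point at a time on either or both curves.
        d = dist(P[i], Q[j])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(dp(0, j - 1), d)
        if j == 0:
            return max(dp(i - 1, 0), d)
        return max(min(dp(i - 1, j), dp(i - 1, j - 1), dp(i, j - 1)), d)

    return dp(len(P) - 1, len(Q) - 1)

# Two parallel intensity curves offset vertically by 1: distance 1.0,
# regardless of where along the time axis each feature occurs.
flat  = [(0, 0), (1, 0), (2, 0)]
shift = [(0, 1), (1, 1), (2, 1)]
print(discrete_frechet(flat, shift))  # 1.0
```

Because the coupling can advance along either curve independently, the metric compares overall shape without requiring that features occur at the same sample index, which is the property highlighted in the post.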
These temporal motifs clearly map onto interpretable psychological dimensions: high/low arousal, duration/ending abruptness, and perceptibility/traitlikeness. We discuss the relationship between the shape of each component's loading and its psychological correlates.
16.09.2025 14:49 · 0 likes · 0 reposts · 1 reply · 0 quotes
Using PCA, we found that three temporal motifs explained a large majority of the variance in people's drawn intensity profiles, with overall intensity, slope, and variability emerging as the three dimensions of people's beliefs about intra-state dynamics.
16.09.2025 14:49 · 1 like · 0 reposts · 1 reply · 0 quotes
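The PCA step can be sketched directly: stack each drawn intensity curve as a row, center the matrix, and take the leading eigenvectors of its covariance. A toy sketch with synthetic curves (numpy only; the two latent motifs generating the data here are stand-ins, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)      # normalized time axis
n_curves = 300

# Synthetic "drawn" curves built from two latent motifs:
# overall intensity (a flat offset) and slope (a linear ramp).
intensity = rng.normal(1.0, 0.5, n_curves)
slope = rng.normal(0.0, 0.5, n_curves)
curves = intensity[:, None] + slope[:, None] * t
curves += rng.normal(0, 0.01, curves.shape)   # small drawing noise

# PCA via eigendecomposition of the covariance of centered curves.
centered = curves - curves.mean(axis=0)
cov = centered.T @ centered / (n_curves - 1)
evals, evecs = np.linalg.eigh(cov)
evals = evals[::-1]            # eigh returns ascending; flip to descending
explained = evals / evals.sum()

# Two motifs generated the data, so two PCs absorb nearly all variance.
print(explained[:2].sum() > 0.95)
```

In the study the analogous result is that three components (intensity, slope, variability) account for most of the variance in participants' curves; the component loadings, plotted against time, are what get interpreted as temporal motifs.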
Using data collected in a curve-drawing task, we measured people's beliefs about these dynamics for individual mental states; we call these mental state intensity profiles. This low-dimensional UMAP embedding shows the variability in average curves for different mental states.
16.09.2025 14:48 · 1 like · 0 reposts · 1 reply · 0 quotes
Experiencing a mental state like joy, confusion, anger, or concentration is a dynamic process that ebbs and flows in intensity over time. A moment of shock may quickly come and go, a flow state might rise gradually and then vanish, or a spark of joy may rise to a crescendo before fading away.
16.09.2025 14:46 · 1 like · 0 reposts · 1 reply · 0 quotes
Excited to share the preprint for my first first-author manuscript! @markthornton.bsky.social and I show that people hold robust, structured beliefs about how individual mental states unfold in intensity over time. We find that these beliefs are reflected in other domains of mental state understanding.
16.09.2025 14:46 · 34 likes · 6 reposts · 2 replies · 1 quote
Original members of SCRAP Lab
Current members of SCRAP Lab
Today, SCRAP Lab returned (right) to the Path of Life Garden in Windsor, VT - the site of our first in-person get-together as a lab 5 years ago (left) - to welcome our newest member, graduate student @gabefajardo.bsky.social!
04.09.2025 23:20 · 17 likes · 4 reposts · 0 replies · 0 quotes
Effect of confound mass on true positive rates under FDR correction. Confound mass represents how large a confound is in terms of the product of its voxel extent and effect size. Results are shown at differing combinations of true effect size, true effect voxel extent, and sample size.
Inflated surface maps of meta-analytic z-statistics from Neurosynth for low-level confounds (top) and high-level cognitive tasks (bottom). Red reflects positive activations, blue reflects negative (de)activations, and darker colors indicate larger z-statistics. Maps are thresholded at |z| = 1 for visualization purposes.
Effect of confound effect size on true positive rates for task effects under FDR correction. Colors indicate sample sizes: N = 25 in blue, N = 50 in green, and N = 100 in orange. Effect sizes are reflected by the darkness of each color, with light shades representing d = .2, medium d = .5, and dark d = .8. The task brain maps and confound brain maps referenced in each panel are shown in Figure 3.
Effect of FDR-based publication bias on observed confound effect sizes. Simulated meta-analytic confound effect sizes are visualized through violin plots for each combination of task effect and confound effect examined in the neural data simulations. Meta-analyses featuring publication bias (orange) substantially inflate these effect size estimates in all cases, relative to meta-analyses featuring no publication bias (blue).
After 5 years, I finally carved out time to turn this blog post on FDR (markallenthornton.com/blog/fdr-pro...) into a manuscript. The preprint features a much broader range of simulations showing how FDR promotes confounds, and how this effect compounds with publication bias: osf.io/preprints/ps...
29.08.2025 15:43 · 49 likes · 15 reposts · 3 replies · 1 quote
New paper from me at Cognition and Emotion! "Deep neural network models of emotion understanding" I discuss how deep nets can be used as cognitive models of emotion perception, prediction, and regulation: doi.org/10.1080/0269...
(h/t @ltjaql.bsky.social for the illustrations!)
07.08.2025 15:20 · 33 likes · 10 reposts · 1 reply · 0 quotes
Excited to share the DIMS Dashboard, a tool for displaying multimodal, extracted time series alongside the original video source! It's designed to support and inspire a richer qualitative-quantitative research cycle.
Huge thanks to my amazing collaborators and mentors who made this possible! 🙏
15.05.2025 14:44 · 10 likes · 7 reposts · 0 replies · 0 quotes
Postprint: osf.io/987fm_v1. To appear in Proceedings of CogSci 2025.
DIMS Dashboard for Exploring Dynamic Interactions and Multimodal Signals.
The interdisciplinary @graceqmiao.bsky.social in the lead here! Developing a dynamic dashboard for a quali-quanti social neuroscience research cycle!
12.05.2025 13:00 · 8 likes · 2 reposts · 0 replies · 1 quote
New preprint! Thrilled to share my latest work with @esfinn.bsky.social -- "Sensory context as a universal principle of language in humans and LLMs"
osf.io/preprints/ps...
05.05.2025 14:49 · 51 likes · 19 reposts · 3 replies · 3 quotes
Alternative title: "LLMs don't have ears (or eyes)"
What do humans and machines miss out on when processing language as purely written text, without all the embodied audiovisual richness that scaffolds language in daily human contexts?
Very proud of this elegant work from @tommybotch.bsky.social
05.05.2025 23:53 · 43 likes · 6 reposts · 2 replies · 0 quotes
The representation of mood in primate anterior insular cortex
Understanding how the brain reflects and shapes mood requires resolving the disconnect between behavioral measures of mood that can only be made in humans (typically based on subjective reports of hap...
A putative neural correlate of mood!
One big (scandalous?) idea, simple analyses, and the STRONGEST brain/behavior correlation I've EVER seen (which is shocking, given that it's mood).
Work with: You-Ping Yang, @catrinahacker.bsky.social and Veit Stuphorn.
www.biorxiv.org/content/10.1...
25.04.2025 13:49 · 167 likes · 48 reposts · 9 replies · 4 quotes
SCRAP Lab had a great time at #SANS2025! Can't wait till next year!
26.04.2025 22:27 · 38 likes · 5 reposts · 0 replies · 0 quotes
Welcome Gabe!!
25.04.2025 03:12 · 2 likes · 0 reposts · 1 reply · 0 quotes