
Noa Cemeljic

@noacemeljic

PhD Student, Somatosensation & Gargalesis Lab, Department of Neuroscience, Karolinska Institutet

29 Followers · 28 Following · 12 Posts · Joined 11.02.2025

Latest posts by Noa Cemeljic @noacemeljic

5/5 Finally, huge thanks to my co-author and supervisor, Konstantina Kilteni (@kkilteni.bsky.social), for her invaluable input and guidance on this work!

24.02.2026 09:31 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

4/5 We propose that the lack of vision reduced the ability of the forward model to sharpen its predictions over time. These findings provide novel insights into how vision contributes to the predictive mechanisms in the brain and advance our understanding of predictive motor control.

24.02.2026 09:29 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Alt text: Perceived intensity of the forces applied in Vision and No vision sessions. (A-B) Baseline-normalized individual PSE values of every trial group in Vision (A) and No vision (B) sessions, overlaid with the linear regression model fitted to the group data. Data are jittered (Β± 1%) to avoid complete overlap, and the shaded area depicts the 95% confidence interval. The two slope values (Ξ²) represent the average slopes in Vision and No vision sessions, and asterisks indicate the significance of slope comparisons against zero (*** p < 0.001). (C) Slopes of the linear regression models fitted to the individual normalized PSE values. In the Vision session (red stripes), the slopes were negative and significantly different from zero, while the slopes in the No vision session (cyan stripes) were not significantly different from zero. The slopes were significantly more negative (i.e., steeper) in the Vision session compared to the No vision session. (D) Intercepts of the linear regression models fitted to the individual normalized PSE values. The intercepts were significantly more negative in the No vision compared to the Vision session. (C, D) The markers represent the individual slope and intercept values, boxplots show the medians and interquartile ranges, and half-violin plots illustrate the data distributions as probability densities. Asterisks indicate the significance of statistical comparisons (* p < 0.05).

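The analysis described in the figure caption — fitting a linear regression to each participant's baseline-normalized PSE values across trial groups, then testing the slopes against zero — can be sketched as follows. This is a minimal illustration on simulated data, not the authors' actual pipeline; the effect size, noise level, and sample size are assumptions.

```python
# Sketch (not the authors' pipeline): fit a linear regression per participant
# to baseline-normalized PSE values across trial groups, then test whether
# the group of slopes differs from zero, as in panel C of the figure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants, n_groups = 20, 5          # assumed sample sizes
trial_group = np.arange(n_groups)

# Simulated normalized PSEs: a small negative slope (gradual attenuation
# over the movement) plus Gaussian noise.
pse = -0.02 * trial_group + rng.normal(0, 0.03, (n_participants, n_groups))

# Per-participant slope via least squares (np.polyfit returns [slope, intercept]).
slopes = np.array([np.polyfit(trial_group, p, 1)[0] for p in pse])

# One-sample t-test of the slopes against zero.
t, p_val = stats.ttest_1samp(slopes, 0.0)
print(f"mean slope = {slopes.mean():.4f}, t = {t:.2f}, p = {p_val:.4g}")
```

A significantly negative mean slope would correspond to the Vision-session pattern in the caption; slopes indistinguishable from zero would correspond to the No vision session.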

3/5 We again demonstrated the gradual somatosensory attenuation when movements were performed with vision. Importantly, this temporal tuning was reduced in the absence of visual input: somatosensory perception was attenuated more uniformly, rather than gradually, throughout the movement.

24.02.2026 09:29 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Alt text: An illustration of the experimental setup. Participants completed two experimental sessions in which they performed movements to self-touch with full vision available (Vision session) or while blindfolded (No Vision session). Participants made a reaching movement with their right hand from the starting position towards one of the force sensors placed above their left index or ring finger, concluding the movement with a tap on the sensor with their right index finger. A motion sensor attached to the right index finger tracked participants' movements on every trial. The tactile stimuli were delivered to the left index or ring finger through a probe attached to the motor.


2/5 Here, we investigated the contribution of vision to this temporal attenuation of somatosensory perception. Participants discriminated the forces applied to their left hand during the right hand's reaching movement towards it, in sessions with and without vision.

24.02.2026 09:28 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
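In force-discrimination tasks like this, the perceived intensity (the PSE reported in the figures) is typically read off a psychometric function fitted to the participant's responses. Here is a generic sketch of that step — the forces, response proportions, and logistic form are illustrative assumptions, not the paper's exact procedure.

```python
# Generic psychophysics sketch (not necessarily the paper's exact method):
# fit a logistic psychometric curve to "comparison felt stronger" responses
# and read off the 50% point, i.e. the point of subjective equality (PSE).
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, width):
    """Probability of judging the comparison force as stronger than the test."""
    return 1.0 / (1.0 + np.exp(-(x - pse) / width))

# Hypothetical comparison forces (N) and proportion of "stronger" responses.
forces = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
p_stronger = np.array([0.05, 0.10, 0.30, 0.55, 0.80, 0.95, 1.00])

(pse, width), _ = curve_fit(logistic, forces, p_stronger, p0=[2.0, 0.5])
print(f"PSE = {pse:.2f} N")
```

A PSE below the magnitude of the test force would indicate that the self-generated touch was perceived as weaker, i.e. attenuated; tracking how the PSE changes across the movement gives the temporal profile described in the thread.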

1/5 Using the internal forward model, the brain can predict and attenuate self-touch. In our previous study, we showed that these predictions lead to a gradual attenuation of somatosensory perception (i.e., the touch is perceived as weaker) as the movement to self-touch progresses.

24.02.2026 09:27 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
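The forward-model account in the post can be condensed into a toy computation: the prediction derived from the motor command is subtracted from the incoming afferent signal, leaving a weaker percept. This is a textbook illustration of the idea, not a model from the paper; all numbers are made up.

```python
# Toy illustration (not from the paper) of forward-model attenuation:
# the predicted sensory consequence of the movement is subtracted from
# the actual afferent signal, so self-generated touch feels weaker.
actual_force = 2.0      # N, force produced by the self-touch
predicted_force = 1.4   # N, forward-model prediction from the motor command
gain = 1.0              # how fully the prediction is applied

perceived = actual_force - gain * predicted_force
print(f"perceived ~ {perceived:.1f} N")  # weaker than the 2.0 N input
```

In this picture, the thread's "gradual attenuation" corresponds to the prediction (or its gain) sharpening as the movement unfolds, so the subtracted term grows towards the moment of contact.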
Preview: Vision Fine-Tunes Predictions of Bimanual Self-Touch
When we move to touch ourselves, our somatosensory perception is gradually attenuated due to the predictions of the internal forward models about the somatosensory consequences of our movements. Here...

Happy to share that our new paper has been published in the European Journal of Neuroscience (@ejneuroscience.bsky.social)! Using psychophysics, we show that vision fine-tunes self-touch predictions, leading to the temporal modulation of somatosensory perception during movements to self-touch.

24.02.2026 09:26 πŸ‘ 8 πŸ” 4 πŸ’¬ 5 πŸ“Œ 2
Preview: Motor prediction reduces beta-band power and enhances cerebellar-somatosensory connectivity before self-touch to enable its attenuation
Prevailing theories suggest that the brain uses an internal forward model to predict tactile input during voluntary movements, thereby reducing the intensity of the reafferent tactile sensation, a phe...

πŸ“£ New preprint πŸ“£

The brain attenuates self-touch, but how does this unfold at the neural level before the touch? We used MEG to find out 🧠 πŸ‘‰πŸ‘ˆ

31.07.2025 10:02 πŸ‘ 21 πŸ” 12 πŸ’¬ 2 πŸ“Œ 2
Preview: Costs and benefits of temporal expectations on somatosensory perception and decision-making
Our perception is shaped by prior expectations, including those about the timing of our sensations. These temporal expectations can be formed by recog…

New paper out, led by the brilliant (bsky-less) PhD student Ziliang Xiong!

We show that temporal expectations yield both costs and benefits on somatosensory perception and decision-making!

We used psychophysics and computational modelling!
Thread πŸ‘‡

www.sciencedirect.com/science/arti...

15.05.2025 14:31 πŸ‘ 5 πŸ” 3 πŸ’¬ 1 πŸ“Œ 0

Lastly, I want to give huge thanks to Xavier Job and Konstantina Kilteni (@kkilteni.bsky.social) who were an essential part of this big work!

21.02.2025 09:16 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

Taken together, our results show that sensorimotor predictions dynamically modulate somatosensory perception, and are in strong agreement with the framework of internal forward models.

21.02.2025 09:14 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

We replicated these results in Experiment 2 and demonstrated that if the movements did not generate expectations of self-touch, no gradual attenuation of somatosensory perception was present. This indicates that sensorimotor context is crucial in the temporal modulation of somatosensory perception.

21.02.2025 09:14 πŸ‘ 2 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

In Experiment 1, we show the temporal tuning of somatosensory perception during the movements to self-touch. The forces felt progressively weaker during the movement, reached their minimum perceived intensity at the time of self-touch, and recovered after the movement ended.

21.02.2025 09:13 πŸ‘ 2 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

The brain can predict and attenuate self-touch, but how do these predictions impact somatosensation before or after self-touch? We addressed this in behavioural tasks where participants discriminated forces applied to their left hand during the right hand’s reaching movement toward the left hand.

21.02.2025 09:13 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Preview: Predictions of bimanual self-touch determine the temporal tuning of somatosensory perception
We easily distinguish self-touch from the touch of others. This distinction is suggested to arise because the brain predicts the somatosensory consequ…

The first paper of my PhD is out in iScience (@cp-iscience.bsky.social)! Here, we show that somatosensory perception is dynamically modulated during the movement in a context-dependent manner.
#neuroskyence #psychscisky #Sensorimotor
www.sciencedirect.com/science/arti...

21.02.2025 09:00 πŸ‘ 10 πŸ” 3 πŸ’¬ 5 πŸ“Œ 2