Glad to hear! Too bad you’re not there this year, but you can still tell us how you’d like to use it right here:
docs.google.com/forms/d/e/1F...
That would be super valuable to us!
@martager.bsky.social
📍Special Talk
📆 Mar 11, 13:00
To conclude, meet the Brain-Body Analysis Special Interest Group (BBSIG): a collaborative initiative to benchmark & standardize the preprocessing and analysis of peripheral physiological signals (ECG, PPG, RESP). Check out our pipelines from v0.0.1! 🫀🫁🧠
👉 bbsig.de
Another poster at the #MindBrainBody Symposium. We need your input!
Come check out our work at the #MindBrainBody Symposium!
How the brain listens to the body matters.
Our new preprint investigates interoceptive processing in schizophrenia spectrum disorders across phenomenology, behavior, and heartbeat-evoked brain responses. 🧠🫀DOI: doi.org/10.64898/202...
Trying to build an experiment in Unity and slowly losing your patience?
Spoiler: that’s completely normal!
Meet EDIA - a modular framework for building studies in Unity.
🧩 Reusable modules
📊 Data sync
🕶️ Multi-headset support
And yes, a “Find Waldo” demo is included - because science should be fun!
Symposium 1.1, here we go!
To read more about AffectTracker, check out our latest publication: doi.org/10.3389/frvi...
@toninfrc.bsky.social @therealspr.bsky.social
Happy also to chat about our Brain-Body Analysis Special Interest Group (BBSIG) pipelines for preprocessing and analysing ECG, PPG and respiration (soon), openly available and ready-to-use with BIDS data as Jupyter Notebooks 🫀🫁
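To give a flavor of what such pipelines automate, here is a minimal, illustrative sketch of one typical step: R-peak detection and heart-rate estimation from an ECG trace. This is a toy example on a synthetic signal, not the BBSIG code itself (which works on BIDS-formatted data and uses more robust detectors).

```python
# Illustrative only: a toy version of one step ECG pipelines automate,
# i.e. R-peak detection and mean heart-rate estimation.

def detect_r_peaks(signal, fs, threshold=0.5, refractory_s=0.3):
    """Return sample indices of local maxima above `threshold`,
    enforcing a refractory period to avoid double detections."""
    peaks, last = [], -int(refractory_s * fs)
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] >= signal[i - 1]
                and signal[i] >= signal[i + 1]
                and i - last >= refractory_s * fs):
            peaks.append(i)
            last = i
    return peaks

def mean_heart_rate(peaks, fs):
    """Mean heart rate (bpm) from successive R-R intervals."""
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(rr) / len(rr))

# Synthetic ECG-like trace: one sharp spike per second, sampled at 250 Hz.
fs = 250
ecg = [1.0 if i % fs == fs // 2 else 0.0 for i in range(10 * fs)]
peaks = detect_r_peaks(ecg, fs)
print(len(peaks), round(mean_heart_rate(peaks, fs)))  # → 10 60
```

Real pipelines add filtering, artifact correction, and validated detection algorithms, but the input/output shape (raw trace in, beat times and HRV metrics out) is the same.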
Work of 20+ wonderful collaborators! ✨
📑 Documentation: www.bbsig.de
Villringer et al. Figure 1. Conceptual framework for brain–body states
Villringer et al. Figure 2. Brain–body micro-, meso-, and macro-states can be distinguished on the basis of their duration and reversibility
'Brain–body states as a link between cardiovascular and mental health'
by Arno Villringer, Vadim Nikulin & Michael Gaebler @mbe-lab.bsky.social @michaelgaebler.com @mpicbs.bsky.social
www.cell.com/trends/neuro...
Check out our new article for young readers (ages 8-15) on heart-brain interactions and interoception! 🧠🫀
I had so much fun co-writing this with @agatapatyczek.bsky.social @el-rei.bsky.social with the support of @michaelgaebler.com ✍️
👉 Share it widely with curious young minds
Yay for #scicomm ✨
Our studies confirmed AffectTracker is reliable, with high user experience and low interference. It opens new avenues for linking subjective experience to physiological dynamics. The tool is open-source and available on GitHub!
#OpenScience
AffectTracker allows users to continuously rate their valence and arousal during VR experiences. It features customizable feedback options, including a simplified affect grid and a novel abstract shape ("Flubber"), designed to be intuitive and minimally interfering.
👥An amazing team effort by:
@fra-malandrone.bsky.social
@lucyroe.bsky.social
A. Ciston
@thefirstfloor.bsky.social
A. Villringer
S. Carletto
@michaelgaebler.com
#neuroskyence #vr #emotion #affect #selfreports
📢Our peer-reviewed article about the AffectTracker is finally out! 😲🕹️📈
Traditional methods for rating emotion often miss the dynamic, moment-to-moment nature of feelings. We designed a tool to capture this continuous affective experience in real-time during dynamic emotional stimulation.
📣 We're at the #MindBrainBody Symposium in Berlin, starting today! Looking forward to connecting with everyone and sharing our latest research 🧠
Our group has an exciting lineup of posters - come chat with us! 💬 Check out the previews below to see where and when to meet us 📌
#MBBS24 #neuroskyence
We centralized our open-science contributions in a new "Tools & Software" section on our website; check out:
- open stimuli (e.g., 3D objects)
- open data (e.g., MindBrainBody)
- tools (e.g., excite-o-meter, AffectTracker)
- analysis scripts
- & more
www.cbs.mpg.de/departments/...
#researchtransparency
The 1-min videos in Study 1 are monoscopic; we chose them as intermediate stimuli between static images and long videos, extending the classical short, event-related stimulus approach. Finding suitable free videos was also challenging. Study 2's 23-min video is stereoscopic, a step further in stimulus type.
3️⃣ The tool offers a novel way to study affective dynamics with minimal interference, effectively capturing the nuances of subjective experience. It opens new research opportunities to link affective states with physiological dynamics
🌟Stay tuned for the full paper & we welcome feedback & discussions! 💭
2️⃣ Empirically evaluated in 2 studies at 2 sites (Berlin & Torino; N = 134) with both shorter 1-min 360° videos (low affective variability [AV] 〰️) and a longer, more dynamic 23-min stimulus (high AV 📈)
Both Grid & Flubber ➡️ high user experience 😃 & low interference with the affective experience itself
1️⃣ Participants can rate continuously and in real time, using the touchpad or joystick of a VR controller 🎮 (here: HTC Vive Pro). The tool comprises three customizable feedback options: a simplified affect grid (Grid), an abstract pulsating variant (Flubber), and no visual feedback (Proprioceptive)
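The core idea can be sketched in a few lines: map the controller's 2D touchpad position to the valence/arousal plane and log a timestamped sample each frame. This is an illustrative Python sketch, not the actual AffectTracker Unity (C#) implementation; the field names and clamping behavior here are assumptions for the example.

```python
# Illustrative sketch (not the AffectTracker Unity code): map a 2D
# controller position to (valence, arousal) and log timestamped samples.

def touchpad_to_affect(x, y):
    """Map touchpad coordinates (each nominally in [-1, 1]) to
    (valence, arousal). Horizontal = valence, vertical = arousal;
    out-of-range values are clamped."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(x), clamp(y)

def log_sample(t, x, y, log):
    """Append one timestamped affect sample to the log."""
    valence, arousal = touchpad_to_affect(x, y)
    log.append({"time_s": t, "valence": valence, "arousal": arousal})

log = []
log_sample(0.00, 0.8, -0.2, log)  # pleasant, slightly calm
log_sample(0.05, 1.4, 0.3, log)   # out-of-range x gets clamped to 1.0
print(log[1])  # → {'time_s': 0.05, 'valence': 1.0, 'arousal': 0.3}
```

Sampling this every frame is what yields the continuous rating trace that can later be aligned with physiological signals.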
👥 Together with F. Malandrone @lucyroe.bsky.social A. Ciston @thefirstfloor.bsky.social A. Villringer S. Carletto @michaelgaebler.com
🛠️ Unity prefab: github.com/afourcade/Af...
🚀 Preprint out! doi.org/10.31234/osf...
We developed, empirically evaluated and openly share **AffectTracker**, a new tool to collect continuous ratings of two-dimensional (valence and arousal) affective experience **during** dynamic emotional stimulation (e.g., 360° videos) in immersive VR! 🥽🧠🟦
I thought it could be nice to connect the community of researchers exploring body-brain interactions on bsky, so here is the Body-Brain Interactions Starter Pack! 🫀🫁👀🧠 #neuroskyence #academicsky
Let me know if you would like to be added or know someone to add. Enjoy and share!
go.bsky.app/Fwqeu32
Title: Real-time continuous rating of affective experience in immersive Virtual Reality
P.361 (Session 1)
@toninfrc.bsky.social
In collaboration with Torino University, we developed a fun and intuitive new tool to record moment-by-moment feelings!
We are coming to Psychologie und Gehirn 2024 (PuG) in Hamburg! Come chat with us! See some teasers in the comments 💬 #PuG2024
Thank you!
The picture was made using the AI image generator DALL-E3
We help shed light on the complex relationship between emotions & the nervous system (or MindBrainBody coupling) under naturalistic stimulation. 🌟 Stay tuned for the full paper & we welcome feedback and discussions! 💭
(Illustration: DALL-E3)
However, whole-brain exploratory analyses revealed a temporo-occipital cluster, where higher EA was linked to decreased 🧠➡️🫀 brain-to-heart (gamma→HF-HRV) and increased 🫀➡️🧠heart-to-brain (LF-HRV→gamma) information flow.
4️⃣ Physiological modeling (using a method by @diegocandiar and others) did not provide evidence for our hypothesis that higher EA changes the bidirectional information flow between HF-HRV & posterior alpha power. 🧠🔁🫀