Top left: Experimental paradigm. The authors analyzed EEG data recorded from 49 sleeping human newborns as they were exposed to monophonic piano melodies composed by J. S. Bach (real condition) and control stimuli (shuffled condition). Top right: Surprise and entropy. Surprise and entropy associated with each note's timing (green, St and Et, respectively) and pitch (yellow, Sp and Ep, respectively) were estimated using an unsupervised statistical learning model trained on all stimuli. Dot plots display mean surprise and entropy associated with real and shuffled music, averaged across melodies (left panel) and separately for each melody (right panel). Bottom: Analytical approach. Multivariate Temporal Response Function (mTRF) models were fit to describe the forward relationship between multiple stimulus features and the EEG signal. The full TRF model (leftmost panel) included low-level acoustic features (spectral flux, acoustic onset, IOI, and IPI) and high-level features (surprise and entropy of pitch and timing).
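The caption's note-level surprise and entropy can be illustrated with a minimal sketch. The study's actual unsupervised statistical learning model is not specified here; the bigram (first-order Markov) model below, with add-one smoothing, is a simplified stand-in assumption. Per note, surprise is S = -log2 p(note | context) and entropy is the uncertainty of the predictive distribution over possible continuations, E = -Σx p(x | context) log2 p(x | context); a shuffled melody should yield higher surprise than a statistically regular one.

```python
from collections import Counter, defaultdict
import math

def train_bigram(sequences):
    """Count next-symbol occurrences given the previous symbol."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    return counts

def surprise_entropy(seq, counts, alphabet):
    """Per-note surprise and entropy under the bigram model (add-one smoothing)."""
    S, E = [], []
    V = len(alphabet)
    for prev, nxt in zip(seq, seq[1:]):
        total = sum(counts[prev].values()) + V
        # surprise of the note that actually occurred
        p_next = (counts[prev][nxt] + 1) / total
        S.append(-math.log2(p_next))
        # entropy of the full predictive distribution before the note
        ent = 0.0
        for x in alphabet:
            p = (counts[prev][x] + 1) / total
            ent -= p * math.log2(p)
        E.append(ent)
    return S, E
```

In this toy setup, training on a strictly alternating sequence makes the expected transition low-surprise and the unexpected one high-surprise, mirroring the real-vs-shuffled contrast in the figure. The same machinery applies whether the symbols encode pitch or inter-onset timing.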
Does our very human ability to anticipate #musical structure exist at birth? @robertabianco.bsky.social @giacomonovembre.bsky.social &co show that #newborns encode #rhythmic (but not melodic) expectations based on statistical regularities in real #music @plosbiology.org plos.io/4kqKVWg