Abstract
Statistical learning underlies the generation of expectations with different degrees of uncertainty. In music, uncertainty applies to expectations for pitches in a melody. This uncertainty can be quantified as the Shannon entropy of distributions of expectedness ratings for multiple continuations of each melody, obtained with the probe-tone paradigm. We hypothesised that statistical learning of music can be modelled as a process of entropy reduction. Specifically, implicit learning of statistical regularities allows listeners to reduce the relative entropy (i.e. symmetrised Kullback-Leibler divergence) between their prior expectancy profiles and the probability distributions of a musical style or of stimuli used in short-term experiments.
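As a concrete illustration of these quantities, the sketch below (not part of the original study; the 13-tone profiles and all rating values are hypothetical) normalises probe-tone expectedness ratings into probability distributions and computes Shannon entropy and the symmetrised Kullback-Leibler divergence between an expectancy profile and a stimulus distribution.

```python
# Minimal sketch, assuming normalised probe-tone ratings are treated as
# probability distributions. All numeric values are hypothetical.
import numpy as np

def to_distribution(ratings):
    """Convert raw expectedness ratings to a probability distribution."""
    r = np.asarray(ratings, dtype=float)
    return r / r.sum()

def shannon_entropy(p):
    """Shannon entropy in bits; zero-probability entries contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def symmetrised_kl(p, q, eps=1e-12):
    """Symmetrised Kullback-Leibler divergence: KL(p||q) + KL(q||p)."""
    p = np.clip(p, eps, None); p = p / p.sum()
    q = np.clip(q, eps, None); q = q / q.sum()
    return np.sum(p * np.log2(p / q)) + np.sum(q * np.log2(q / p))

# Hypothetical ratings over 13 probe tones (e.g. one octave of the
# Bohlen-Pierce scale) before and after exposure, and the distribution
# implied by the stimulus grammar.
prior     = to_distribution([4, 4, 3, 4, 4, 3, 4, 4, 3, 4, 4, 3, 4])  # near-flat
posterior = to_distribution([6, 2, 1, 5, 2, 1, 6, 2, 1, 5, 2, 1, 6])  # peaked
stimulus  = to_distribution([7, 1, 1, 6, 1, 1, 7, 1, 1, 6, 1, 1, 7])

print(shannon_entropy(prior), shannon_entropy(posterior))                     # entropy decreases
print(symmetrised_kl(prior, stimulus), symmetrised_kl(posterior, stimulus))   # relative entropy decreases
```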
Five previous probe-tone experiments with musicians and non-musicians were revisited. In Experiments 1-2, participants rated expectedness for tonal melodies and Charlie Parker solos. Experiments 3-5 tested participants before and after 25-30 minutes of exposure to 5, 15 or 400 melodies generated from a finite-state grammar using the Bohlen-Pierce scale.
As predicted, we found between-participant differences in relative entropy corresponding to degree and relevance of musical training, and within-participant decreases in entropy after short-term statistical learning of novel music. Thus, whereas inexperienced listeners make high-entropy predictions, following the Principle of Maximum Entropy, statistical learning over varying timescales enables listeners to generate melodic expectations with reduced entropy. These findings are consistent with the Free-Energy Principle, which has been proposed as a unified theory for brain function.
| Original language | English |
| --- | --- |
| Publication date | 2015 |
| Publication status | Published - 2015 |
| Event | International Conference on Interdisciplinary Advances in Statistical Learning, Basque Center on Cognition, Brain, and Language, Ibaeta University, San Sebastian, Spain. Duration: 18 Jun 2015 → 27 Jun 2015 |
Conference
| Conference | International Conference on Interdisciplinary Advances in Statistical Learning |
| --- | --- |
| Location | Basque Center on Cognition, Brain, and Language, Ibaeta University |
| Country/Territory | Spain |
| City | San Sebastian |
| Period | 18/06/2015 → 27/06/2015 |