Abstract:

Musical rhythm and meter are characterized by simple proportional relationships between event durations within pieces, which makes comparing rhythms across different musical pieces, especially pieces at different tempos, a nebulous practice. Though the "main tempo," or tactus, of a piece serves as an important cognitive reference point, it is difficult to identify objectively. In this paper, I investigate how statistical regularities in rhythmic patterns can be used to determine how to compare pieces at different tempos, speculating that these regularities could relate to the perception of tactus. Using a Bayesian statistical approach, I model first-order (two-gram) rhythmic event transitions in a symbolic dataset of rap transcriptions (MCFlow), allowing the model to renotate the rhythmic values of each transcription as needed to optimize fit. The resulting model produces "renotations" that match a priori predictions from the original dataset's transcriber. I then demonstrate that the model can be used to rhythmically align new data, giving an objective basis for rhythmic annotation decisions.
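As a rough illustration of the kind of procedure the abstract describes, here is a minimal sketch in Python of one way a corpus-wide renotation scheme could work: fit a first-order (bigram) model over duration transitions, then rescale each transcription by the factor that maximizes its likelihood under that model, iterating until stable. This is an illustrative assumption on my part, not the paper's actual Bayesian model; the encoding of durations as beat fractions, the restriction to halving or doubling, and all function names are hypothetical.

```python
import math
from collections import Counter
from itertools import pairwise  # Python 3.10+

# Candidate renotation factors: halve, keep, or double every duration.
# (Hypothetical choice; the paper's actual model may differ.)
SCALES = (0.5, 1.0, 2.0)

def bigram_model(pieces, alpha=1.0):
    """Dirichlet-smoothed first-order transition model over duration pairs."""
    counts = Counter()
    for durs in pieces:
        counts.update(pairwise(durs))
    total = sum(counts.values())
    vocab = max(len(counts), 1)
    def log_prob(bigram):
        # Add-alpha smoothing so unseen transitions get nonzero probability.
        return math.log((counts[bigram] + alpha) / (total + alpha * vocab))
    return log_prob

def best_scale(durs, log_prob):
    """Pick the renotation factor that maximizes bigram log-likelihood."""
    def score(s):
        return sum(log_prob(bg) for bg in pairwise(tuple(d * s for d in durs)))
    return max(SCALES, key=score)

def renotate(pieces, n_iters=10):
    """Alternate between fitting the corpus model and rescaling each piece."""
    scales = [1.0] * len(pieces)
    for _ in range(n_iters):
        scaled = [tuple(d * s for d in durs) for durs, s in zip(pieces, scales)]
        log_prob = bigram_model(scaled)
        new_scales = [s * best_scale(sc, log_prob)
                      for s, sc in zip(scales, scaled)]
        if new_scales == scales:  # converged: no piece wants rescaling
            break
        scales = new_scales
    return scales
```

Under this sketch, a transcription notated twice as fast as the corpus norm would converge to a scale of 0.5, which is also how a new transcription could be aligned to the fitted model, as the abstract's final sentence suggests.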

