Beat-Based Rhythm Quantization of MIDI Performances
Abstract
We propose a transformer-based rhythm quantization model that incorporates beat and downbeat information to quantize MIDI performances into metrically aligned, human-readable scores. We introduce a beat-based preprocessing method that converts score and performance data into a unified token representation. We optimize our model architecture and data representation, and train on piano and guitar performances. Our model surpasses the state of the art on the MUSTER metric.
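To illustrate the kind of beat-based preprocessing the abstract describes, the following is a minimal, hypothetical sketch (not the paper's actual scheme): each performed note onset is located within its surrounding beat interval and snapped to a fixed subdivision grid, yielding beat-relative tokens. The function name, token format, and subdivision count are illustrative assumptions.

```python
# Hypothetical sketch of beat-based onset tokenization; not the paper's exact method.
from bisect import bisect_right

def tokenize_onsets(onset_times, beat_times, subdivisions=4):
    """Map performed note onsets (seconds) to beat-relative tokens.

    Each onset is assigned to a beat interval and snapped to the nearest
    of `subdivisions` equal positions within it, producing tokens such as
    "B1_POS2" (beat index 1, subdivision position 2).
    """
    tokens = []
    for t in onset_times:
        # Find the beat interval containing t (clamped to valid intervals).
        i = bisect_right(beat_times, t) - 1
        i = max(0, min(i, len(beat_times) - 2))
        start, end = beat_times[i], beat_times[i + 1]
        # Fractional position within the beat, snapped to the grid.
        frac = (t - start) / (end - start)
        snapped = round(frac * subdivisions)
        if snapped == subdivisions:
            # Onset rounds up to the start of the next beat.
            i += 1
            snapped = 0
        tokens.append(f"B{i}_POS{snapped}")
    return tokens

# Example: two beats at 0.0 s and 0.5 s, slightly imprecise performed onsets.
print(tokenize_onsets([0.02, 0.26, 0.51, 0.74], beat_times=[0.0, 0.5, 1.0]))
# → ['B0_POS0', 'B0_POS2', 'B1_POS0', 'B1_POS2']
```

Representing onsets relative to detected beats, rather than in absolute time, is what lets a sequence model align expressively timed performances with a metrical grid.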
arXiv: 2508.19262