arxiv:2508.19262

Beat-Based Rhythm Quantization of MIDI Performances

Published on Aug 18, 2025

Abstract

A transformer-based model for rhythm quantization that uses beat and downbeat information to align MIDI performances with metrically structured scores, achieving superior performance on the MUSTER metric.

AI-generated summary

We propose a transformer-based rhythm quantization model that incorporates beat and downbeat information to quantize MIDI performances into metrically aligned, human-readable scores. A beat-based preprocessing method transfers score and performance data into a unified token representation. We optimize the model architecture and data representation and train on piano and guitar performances. Our model exceeds state-of-the-art performance as measured by the MUSTER metric.
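To make the task concrete: rhythm quantization means snapping the sloppy onset times of a human performance onto a metrical grid derived from beat positions. The sketch below is an illustrative naive baseline, not the paper's transformer model; the function name, grid construction, and example timings are all assumptions chosen for demonstration.

```python
# Naive rhythm quantization baseline: snap performed MIDI note onsets to the
# nearest subdivision of a beat grid. Illustrative only -- the paper instead
# learns this mapping with a transformer over a unified token representation.

def quantize_onsets(onsets, beats, subdivisions=4):
    """Snap each onset (seconds) to the nearest grid point between beats.

    `beats` is an ascending list of beat times in seconds (e.g. from a beat
    tracker); `subdivisions` splits each beat into equal parts
    (4 = sixteenth notes in 4/4).
    """
    # Build the fine-grained grid between consecutive beats.
    grid = []
    for b0, b1 in zip(beats, beats[1:]):
        step = (b1 - b0) / subdivisions
        grid.extend(b0 + i * step for i in range(subdivisions))
    grid.append(beats[-1])  # include the final beat itself

    # Snap each onset to the closest grid point.
    return [min(grid, key=lambda g: abs(g - t)) for t in onsets]

# Example: 120 BPM (beats every 0.5 s) with slightly off-grid onsets.
beats = [0.0, 0.5, 1.0, 1.5, 2.0]
performed = [0.02, 0.48, 0.76, 1.52]
print(quantize_onsets(performed, beats))  # -> [0.0, 0.5, 0.75, 1.5]
```

A rule-based snapper like this fails on tempo drift and expressive timing, which is precisely why the paper conditions a learned model on beat and downbeat information instead.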

