MusPy: A Toolkit for Symbolic Music Generation
Hao-Wen Dong, Ke Chen, Julian McAuley, Taylor Berg-Kirkpatrick
Why MusPy?
(Pipeline diagram) A typical music generation pipeline consists of data collection, data preprocessing, model creation, model training and result analysis. MusPy covers the data and analysis stages and interfaces with a machine learning library (e.g., PyTorch, TensorFlow) for model creation and training.
Overview
(System diagram) The Music class sits at the center and connects to external systems (music notation software, synthesizers, sequencers, DAWs), external datasets and machine learning libraries (e.g., PyTorch, TensorFlow):
• Data I/O: read/write MIDI (.mid), MusicXML (.mxl) and ABC (.abc) files; save/load JSON (.json) and YAML (.yaml) files; convert to/from objects in other music libraries (music21, pretty_midi, mido, Pypianoroll)
• Representations: pitch-based, event-based, note-based and piano-roll
• Dataset management: unified dataset downloading and parsing; wrappers for PyTorch and TensorFlow datasets
muspy.Music Class
• Core class of MusPy
• A universal container for symbolic music
• Serializable to JSON/YAML

Example (YAML):
metadata:
  schema_version: '0.0'
  title: Für Elise
  creators: [Ludwig van Beethoven]
  collection: Example dataset
  source_filename: example.json
resolution: 4
tempos:
  - {time: 0, qpm: 72.0}
key_signatures:
  - {time: 0, root: 9, mode: minor}
time_signatures:
  - {time: 0, numerator: 3, denominator: 8}
downbeats: [4, 16]
lyrics:
  - {time: 0, lyric: Nothing but a lyric}
annotations:
  - {time: 0, annotation: Nothing but an annotation}
tracks:
  - program: 0
    is_drum: false
    name: Melody
    notes:
      - {time: 0, duration: 2, pitch: 76, velocity: 64}
      - {time: 2, duration: 2, pitch: 75, velocity: 64}
      - {time: 4, duration: 2, pitch: 76, velocity: 64}
      - {time: 6, duration: 2, pitch: 75, velocity: 64}
      - {time: 8, duration: 2, pitch: 76, velocity: 64}
      - {time: 10, duration: 2, pitch: 71, velocity: 64}
      - {time: 12, duration: 2, pitch: 74, velocity: 64}
      - {time: 14, duration: 2, pitch: 72, velocity: 64}
      - {time: 16, duration: 2, pitch: 69, velocity: 64}
    lyrics:
      - {time: 0, lyric: Nothing but a lyric}
    annotations:
      - {time: 0, annotation: Nothing but an annotation}
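The same piece can also be built programmatically. A minimal sketch, assuming the class constructors accept the keyword arguments shown in the YAML schema above and that Music.save infers JSON/YAML from the file extension:

import muspy

# Build a Music object mirroring the YAML example above (first two notes only)
music = muspy.Music(
    metadata=muspy.Metadata(title="Für Elise", creators=["Ludwig van Beethoven"]),
    resolution=4,
    tempos=[muspy.Tempo(time=0, qpm=72.0)],
    key_signatures=[muspy.KeySignature(time=0, root=9, mode="minor")],
    time_signatures=[muspy.TimeSignature(time=0, numerator=3, denominator=8)],
    tracks=[
        muspy.Track(
            program=0,
            is_drum=False,
            name="Melody",
            notes=[
                muspy.Note(time=0, duration=2, pitch=76, velocity=64),
                muspy.Note(time=2, duration=2, pitch=75, velocity=64),
            ],
        )
    ],
)

# Serialize to JSON or YAML
music.save("example.json")
music.save("example.yaml")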
I/O Interfaces
A muspy.Music object can be converted to and from:
• JSON (.json) and YAML (.yaml) files
• MIDI (.mid), MusicXML (.mxl) and ABC (.abc) files
• Objects in other music libraries: music21, pretty_midi, mido, Pypianoroll
• Representations: pitch-based, event-based, note-based and piano-roll
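A sketch of these I/O round trips; it assumes read/write infer the format from the file extension and that example.mid exists on disk:

import muspy

# Read a MIDI file into a Music object and write it back to disk
# (MusicXML and ABC files go through the same read/write interface)
music = muspy.read("example.mid")
music.write("copy_of_example.mid")

# Convert to/from objects of other music libraries
pm = muspy.to_pretty_midi(music)
music_from_pm = muspy.from_pretty_midi(pm)

# Convert to/from a trainable representation (here, event-based)
events = muspy.to_event_representation(music)
music_from_events = muspy.from_event_representation(events)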
Dataset Management
The pipeline goes from a remote source dataset, to a converted dataset, to Music objects, to training data:

# Download and extract the dataset
nes = muspy.NESMusicDatabase(root="data/nes/", download_and_extract=True)

# Convert the dataset to MusPy Music objects
nes.convert()

# Iterate over the dataset
for music in nes:
    do_something(music)

# Convert to a PyTorch dataset
dataset = nes.to_pytorch_dataset(representation="pianoroll")
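The resulting object behaves like a standard PyTorch dataset, so it plugs directly into a DataLoader. A minimal sketch; the batch size of 1 sidesteps the fact that songs have different lengths (in practice you would crop or pad them in a custom collate_fn):

from torch.utils.data import DataLoader

loader = DataLoader(dataset, batch_size=1, shuffle=True)
for batch in loader:
    ...  # feed the piano-roll tensor to your model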
Datasets (more coming soon!)
Result Analysis Tools
Pitch-related metrics:
- pitch_range
- n_pitches_used
- n_pitch_classes_used
- polyphony
- polyphony_rate
- pitch_in_scale_rate
- scale_consistency
- pitch_entropy
- pitch_class_entropy
Rhythm-related metrics:
- empty_beat_rate
- empty_measure_rate
- drum_in_pattern_rate
- drum_pattern_consistency
- groove_consistency
Also: audio rendering, piano-roll visualization, score visualization
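A sketch of computing a few of these metrics and rendering the results; the measure_resolution argument passed to groove_consistency (time steps per measure) and its value are assumptions to check against the documentation:

import muspy

music = muspy.read("example.mid")

# Pitch-related metrics
print(muspy.pitch_range(music))
print(muspy.n_pitches_used(music))
print(muspy.pitch_entropy(music))
print(muspy.scale_consistency(music))

# Rhythm-related metrics
print(muspy.empty_beat_rate(music))
print(muspy.groove_consistency(music, measure_resolution=4 * music.resolution))

# Visualization (requires matplotlib)
muspy.show_pianoroll(music)
muspy.show_score(music)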
Dataset Analysis
(Figures: tempo distributions, length distributions and key distributions across the supported datasets)
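Such statistics can be gathered by iterating over a converted dataset. A minimal sketch using the attributes from the Music schema above; get_end_time as the length measure is an assumption, and the histogram plotting is left to matplotlib:

import muspy

nes = muspy.NESMusicDatabase(root="data/nes/", download_and_extract=True)
nes.convert()

qpms, lengths, key_roots = [], [], []
for music in nes:
    qpms.extend(tempo.qpm for tempo in music.tempos)
    lengths.append(music.get_end_time())  # song length in time steps
    key_roots.extend(key.root for key in music.key_signatures)

# Plot the distributions, e.g., plt.hist(qpms), plt.hist(lengths), plt.hist(key_roots)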
Experiments
(Figures: cross-dataset generalizability, reported as perplexities and perplexities vs. dataset size)
Thank you!
pip install muspy
Hao-Wen Dong, Ke Chen, Julian McAuley, Taylor Berg-Kirkpatrick
University of California San Diego