The Musical Gestures Toolbox for Python is a collection of tools for visualizing and analyzing audio and video files.
```shell
pip install musicalgestures
```

Installing `musicalgestures` pulls in its core Python dependencies automatically. You still need a working FFmpeg installation on your system for video processing.
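Because FFmpeg is installed outside of `pip`, it is worth verifying it is on your `PATH` before processing video. A minimal sketch using only the standard library (the helper name `check_ffmpeg` is ours, not part of the toolbox):

```python
import shutil
import subprocess

def check_ffmpeg():
    """Return the first line of 'ffmpeg -version' output, or None if FFmpeg is missing."""
    path = shutil.which('ffmpeg')
    if path is None:
        return None
    # 'ffmpeg -version' prints something like 'ffmpeg version 6.1.1 ...' on its first line
    out = subprocess.run([path, '-version'], capture_output=True, text=True)
    return out.stdout.splitlines()[0] if out.returncode == 0 else None

print(check_ffmpeg() or 'FFmpeg not found -- install it before processing video')
```

If this prints the "not found" message, install FFmpeg with your system's package manager before using the video features.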
```python
import musicalgestures as mg

# Load a video
v = mg.MgVideo('dance.avi')

# Create visualizations
v.grid()
v.videograms()
v.average()
v.history()

# Perform motion analysis
v.motion()

# Audio analysis
v.audio.waveform()
v.audio.spectrogram()
v.audio.tempogram()

# Pose estimation
v.pose(model='body_25', device='cpu')
```

Notes:

- FFmpeg is required for video I/O and preprocessing.
- `pose()` downloads the OpenPose weights on first use if they are missing. In notebooks and other non-interactive runs, missing pose weights are downloaded automatically when possible.
- If `device='gpu'` is requested but OpenCV CUDA support is unavailable, `pose()` falls back to CPU execution.
- `flow.dense()`, `flow.sparse()`, and `blur_faces()` use the CPU by default (`use_gpu=False`). Set `use_gpu=True` to opt into CUDA acceleration with automatic CPU fallback.
- `get_cuda_device_count()` offers a quick check of whether OpenCV sees any CUDA devices.
- `blur_faces()` consistently returns the generated result object, including when `save_data=True`.
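The automatic CPU fallback described above can be pictured with a small sketch. This is an illustration of the pattern, not the toolbox's actual implementation; `run_with_fallback` and `cuda_device_count` are hypothetical names standing in for the `use_gpu` / `get_cuda_device_count()` machinery:

```python
def run_with_fallback(task_gpu, task_cpu, use_gpu, cuda_device_count):
    """Run the GPU path only when requested AND CUDA devices are visible;
    otherwise (or on a GPU runtime failure) fall back to the CPU path.

    task_gpu and task_cpu are zero-argument callables; cuda_device_count
    mimics a get_cuda_device_count()-style probe (hypothetical helper).
    """
    if use_gpu and cuda_device_count > 0:
        try:
            return task_gpu()
        except RuntimeError:
            pass  # GPU path failed at runtime; fall through to CPU
    return task_cpu()

# With no CUDA devices visible, the CPU path runs even though use_gpu=True:
result = run_with_fallback(lambda: 'gpu', lambda: 'cpu',
                           use_gpu=True, cuda_device_count=0)
print(result)
```

The design point is that callers never see an exception just because CUDA is absent: requesting the GPU is a preference, and the CPU path is always a valid final answer.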
- Video Analysis: Motion detection, optical flow, pose estimation
- Audio Processing: Spectrograms, audio descriptors, tempo analysis
- Visualizations: Motiongrams, videograms, motion history
- Integration: Works with NumPy, SciPy, and Matplotlib ecosystems
- Cross-platform: Linux, macOS, Windows support
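Since the analysis results can be taken into the NumPy ecosystem, downstream processing is ordinary array work. A sketch of that workflow, assuming per-frame motion data has been exported to CSV (the `time`/`qom` column layout here is illustrative, not the toolbox's exact schema):

```python
import io
import numpy as np

# Stand-in for a per-frame motion-data CSV such as a motion analysis might export
# (values and column names are made up for illustration):
csv_data = io.StringIO("""time,qom
0.00,0.012
0.04,0.034
0.08,0.051
0.12,0.047
""")

# names=True reads the header row so columns can be addressed by name
data = np.genfromtxt(csv_data, delimiter=',', names=True)

print('Mean quantity of motion:', data['qom'].mean())
print('Peak motion at t =', data['time'][data['qom'].argmax()])
```

From here the same arrays feed directly into SciPy (e.g. filtering or peak picking) or Matplotlib for plotting.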
See this short video presentation made for the Nordic Sound and Music Computing Conference 2021:
- Python 3.10+
- FFmpeg
- See installation guide for complete requirements
This toolbox builds on the Musical Gestures Toolbox for Matlab, which in turn builds on the Musical Gestures Toolbox for Max. Many researchers and research assistants have contributed to its development over the years, including Balint Laczko, Joachim Poutaraud, Frida Furmyr, Marcus Widmer, and Alexander Refsum Jensenius.
The software is currently maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.
If you use this toolbox in your research, please cite this article:
- Laczkó, B., & Jensenius, A. R. (2021). Reflections on the Development of the Musical Gestures Toolbox for Python. Proceedings of the Nordic Sound and Music Computing Conference, Copenhagen.
```bibtex
@inproceedings{laczkoReflectionsDevelopmentMusical2021,
  title     = {Reflections on the Development of the Musical Gestures Toolbox for Python},
  author    = {Laczkó, Bálint and Jensenius, Alexander Refsum},
  booktitle = {Proceedings of the Nordic Sound and Music Computing Conference},
  year      = {2021},
  address   = {Copenhagen},
  url       = {http://urn.nb.no/URN:NBN:no-91935}
}
```

This toolbox is released under the GNU General Public License v3.0.

