Motion Capture

These videos show our ongoing research collaboration with the CIRMMT using the Qualisys motion capture system.
The first video shows a recording session (Michel Savail playing Saudade No. 3 by R. Dyens). The second video shows the model obtained after processing the recorded data (Michel Savail playing Leo Brouwer).


Hexaphonic Audio Analysis

These videos show the real-time capabilities of our analysis library using two different approaches. In the first video the library is used in openFrameworks, whereas the second video shows its usage as a VST plugin. Both videos show real-time music analysis of the same guitar recording (Josep Soto playing Bach). The left channel plays the original recording and the right channel plays the detected notes.

Artistic Experiment

Heber Manuel Pérez Torres, a Master's student at Pompeu Fabra University, explores the artistic capabilities of the guitar prototype.

Recording Session

This video shows a recording session at the ESMUC studio with our prototype (an array of capacitive sensors mounted on the fretboard of the guitar). Benjamí Abad plays Blue Bossa.


Gesture detection

This video shows real-time finger position detection using the capacitive sensors described here. Each digit corresponds to the number of fingers pressing the same fret; these positions can be played at different hand positions and on different strings. 1 refers to one-finger activation on any string, 2 refers to two-finger activation at the same fret on any strings, 3 refers to three-finger activation at the same fret on any strings, and 6 refers to barre activation. Yellow bars and digits correspond to the position detected by a k-NN algorithm.

See the "Analyzing left hand fingering in guitar playing" paper on the Publications page.
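The k-NN classification step described above can be sketched as follows. This is a minimal illustration, not the project's implementation: the feature vectors, labels, and value of k are placeholder assumptions, standing in for frames of capacitive-sensor activations.

```python
# Minimal k-NN sketch for labelling left-hand positions from sensor readings.
# Training data and k are illustrative placeholders, not the project's values.
import numpy as np

def knn_classify(train_X, train_y, sample, k=3):
    """Return the majority label among the k nearest training vectors."""
    dists = np.linalg.norm(train_X - sample, axis=1)   # Euclidean distances
    nearest = np.argsort(dists)[:k]                    # indices of k closest
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy training set: each row is a flattened frame of sensor activations;
# labels follow the video's convention (1, 2, 3 fingers, 6 = barre).
train_X = np.array([[1, 0, 0, 0],
                    [1, 1, 0, 0],
                    [1, 1, 1, 0],
                    [1, 1, 1, 1]], dtype=float)
train_y = np.array([1, 2, 3, 6])

print(knn_classify(train_X, train_y, np.array([1, 1, 1, 1.1]), k=1))
```

A new sensor frame is simply compared against the stored labelled frames, so the detector needs no training phase beyond collecting examples of each position.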

Articulation detection

This video demonstrates our expressive articulation detection model. As seen in the video, our algorithm works on audio input. For both F0 and onset detection we use the Aubio library. For each expressive articulation model, we use 40 extracted expressive articulations. We then compare the properties of each articulation (right side) with the constructed models and classify the type of expressive articulation. This comparison is shown in the Model section (lower left) of the video. The Matlab demo will be available soon.

See the "Legato and Glissando identification in Classical Guitar" paper on the Publications page.
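The model-comparison step described above can be sketched as follows, under stated assumptions: the F0 contour for a two-note transition is taken as already extracted (as Aubio's pitch tracker would provide), and the template shapes and names are illustrative stand-ins for the models built from the 40 extracted articulations.

```python
# Hedged sketch of the template-comparison step: given an F0 contour for a
# two-note transition, compare it against per-articulation model contours
# and pick the closest one. Templates here are illustrative, not the
# project's actual models.
import numpy as np

# Model contours (pitch over the transition, normalized to 0 -> 1 semitone):
templates = {
    "legato":    np.concatenate([np.zeros(10), np.ones(10)]),  # abrupt step
    "glissando": np.linspace(0.0, 1.0, 20),                    # gradual slide
}

def classify_articulation(contour):
    """Return the articulation whose template contour is closest."""
    distances = {name: np.linalg.norm(contour - t)
                 for name, t in templates.items()}
    return min(distances, key=distances.get)

# A noisy gradual slide should be labelled as a glissando.
observed = np.linspace(0.0, 1.0, 20) \
    + np.random.default_rng(0).normal(0, 0.05, 20)
print(classify_articulation(observed))  # → glissando
```

The same nearest-template idea extends to any number of articulation classes; only the set of model contours grows.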

Kopèrnic - Interview for TV

This video shows an excerpt of an interview recorded for a local TV station in Barcelona. Here, we explain the main goals of this research and show the prototype we used for our initial experiments. The interview is in Catalan.

Image-based analysis of Gestures

This video shows the preliminary results of our research on using computer vision techniques to analyze performers' gestures and fingering. We use OpenCV and openFrameworks.