Publications
-
Skogstad, Ståle Andreas van Dorp; Nymoen, Kristian; Høvin, Mats Erling; Holm, Sverre & Jensenius, Alexander Refsum
(2013).
Filtering Motion Capture Data for Real-Time Applications.
In Yeo, Woon Seung; Lee, Kyogu; Sigman, Alexander; Ji, Haru & Wakefield, Graham (Ed.),
Proceedings of the International Conference on New Interfaces For Musical Expression.
Korea Advanced Institute of Science and Technology.
ISSN 2220-4792.
p. 142–147.
Full text in Research Archive
In this paper we present some custom-designed filters for real-time motion capture applications. Our target application is motion controllers, i.e. systems that interpret hand motion for musical interaction. In earlier research we found effective methods to design nearly optimal filters for real-time applications. However, to be able to design suitable filters for our target application, it is necessary to establish the typical frequency content of the motion capture data we want to filter. This in turn allows us to determine a reasonable cutoff frequency for the filters. We have therefore conducted an experiment in which we recorded the hand motion of 20 subjects. The frequency spectra of these data, together with a method similar to the residual analysis method, were then used to determine reasonable cutoff frequencies. Based on this experiment, we propose three cutoff frequencies for different scenarios and filtering needs: 5, 10 and 15 Hz, which correspond to heavy, medium and light filtering, respectively. Finally, we propose a range of real-time filters applicable to motion controllers, in particular low-pass filters and low-pass differentiators of degrees one and two, which in our experience are the most useful filters for our target application.
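As an aside, a minimal sketch of real-time low-pass smoothing at the proposed "medium" 10 Hz cutoff, assuming a 100 Hz capture rate. This is an illustrative stand-in, not the custom filters from the paper:

```python
import math
import random
import statistics

def one_pole_lowpass(samples, fc, fs):
    """First-order low-pass filter (exponential smoothing).

    fc: cutoff frequency in Hz, fs: sampling rate in Hz.
    An illustrative stand-in, not the custom filters from the paper.
    """
    # Discretized RC filter: alpha = dt / (RC + dt) with RC = 1 / (2*pi*fc)
    alpha = 2 * math.pi * fc / (2 * math.pi * fc + fs)
    out, y = [], samples[0]
    for x in samples:
        y += alpha * (x - y)  # move a fraction of the way toward the new sample
        out.append(y)
    return out

# Synthetic 1-D "hand position" trace: constant position plus sensor noise.
random.seed(1)
raw = [0.5 + random.gauss(0.0, 0.01) for _ in range(1000)]
smooth = one_pole_lowpass(raw, fc=10.0, fs=100.0)  # "medium" 10 Hz filtering
```

Being first-order and causal, such a filter trades attenuation for low delay, which is the central trade-off the paper addresses.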
-
Skogstad, Ståle Andreas van Dorp; Holm, Sverre & Høvin, Mats Erling
(2012).
Digital IIR filters with minimal group delay for real-time applications.
In Hegazy, Yasser G.; Küehn, Paul J. & Khamis, Alaa (Ed.),
Engineering and Technology (ICET), 2012 International Conference on.
IEEE conference proceedings.
ISBN 978-1-4673-4808-9.
p. 1–6.
doi: 10.1109/icengtechnol.2012.6396136.
In this paper we examine the potential for designing digital (IIR) filters with minimal group delay, which are relevant for real-time applications. By formulating filter design as a multi-objective optimization problem and approaching it with an unbiased metaheuristic search algorithm, we have established relationships between filter delay and other filter objectives. These relationships are presented as non-inferior surfaces for different filter orders and design approaches. We present possible designs that are realizable with (1) classical IIR design constructions, and (2) unconstrained global search for filter orders between 2 and 5. Elliptic (Cauer) filters are found to have the highest potential for low group delay among the classical constructions. However, as one might expect, unconstrained IIR search discovers better filters, but is limited to filter orders of ∼5. Currently, there exists no established method that can construct similar IIR filters with a group delay below n/2, where n is the given filter order. Finally, we present some unconstrained filter examples that we claim are nearly optimal.
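The n/2 figure refers to group delay measured in samples for a filter of order n. As a generic numerical check (not the paper's search method), group delay can be estimated from filter coefficients by differencing the phase response; the hypothetical first-order smoother below has a group delay near DC of 1.5 samples, while a 5-tap moving-average FIR is fixed at 2 samples:

```python
import cmath

def freq_response(b, a, w):
    """H(e^jw) for an IIR filter with numerator b and denominator a."""
    z = cmath.exp(-1j * w)
    num = sum(bk * z ** k for k, bk in enumerate(b))
    den = sum(ak * z ** k for k, ak in enumerate(a))
    return num / den

def group_delay(b, a, w, dw=1e-6):
    """Group delay in samples: negative derivative of phase w.r.t. frequency."""
    p_lo = cmath.phase(freq_response(b, a, w - dw))
    p_hi = cmath.phase(freq_response(b, a, w + dw))
    return -(p_hi - p_lo) / (2 * dw)

# 5-tap moving average (FIR): constant group delay of (5 - 1) / 2 = 2 samples.
gd_fir = group_delay([0.2] * 5, [1.0], w=0.1)

# First-order IIR smoother y[n] = a0*x[n] + (1 - a0)*y[n-1]:
# group delay near DC is (1 - a0) / a0 samples.
a0 = 0.4
gd_iir = group_delay([a0], [1.0, -(1.0 - a0)], w=1e-3)
```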
-
Skogstad, Ståle Andreas van Dorp; Holm, Sverre & Høvin, Mats Erling
(2012).
Designing Digital IIR Low-Pass Differentiators With Multi-Objective Optimization.
In Baozong, Yuan; Qiuqi, Ruan & Xiaofang, Tang (Ed.),
Proceedings of 2012 IEEE 11th International Conference on Signal Processing.
IEEE Press.
ISBN 978-1-4673-2194-5.
doi: 10.1109/icosp.2012.6491617.
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum; Voldsund, Arve; Glette, Kyrre Harald; Høvin, Mats Erling & Nymoen, Kristian
[Show all 7 contributors for this article]
(2012).
Classifying Music-Related Actions.
In Cambouropoulos, Emilios; Tsougras, Costas; Mavromatis, Panayotis & Pastiadis, Konstantinos (Ed.),
Proceedings of the ICMPC-ESCOM 2012 Joint Conference: 12th Biennial International Conference for Music Perception and Cognition, 8th Triennial Conference of the European Society for the Cognitive Sciences of Music.
School of Music Studies, Aristotle University of Thessaloniki Thessaloniki, Hellas.
ISBN 978-960-99845-1-5.
p. 352–357.
Full text in Research Archive
Our research on music-related actions is based on the conviction that sensations of both sound and body motion are inseparable in the production and perception of music. The expression "music-related actions" is here used to refer to chunks of combined sound and body motion, typically in the duration range of approximately 0.5 to 5 seconds. We believe that chunk-level music-related actions are highly significant for the experience of music, and we are presently working on establishing a database of music-related actions in order to facilitate access to, and research on, our fast growing collection of motion capture data and related material. In this work, we are confronted with a number of perceptual, conceptual and technological issues regarding classification of music-related actions, issues that will be presented and discussed in this paper.
-
Jensenius, Alexander Refsum; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Voldsund, Arve
(2012).
A Study of the Noise-Level in Two Infrared Marker-Based Motion Capture Systems.
In Sturm, Bob L.; Dahl, Sofia & Larsen, Jan (Ed.),
Proceedings of the 9th Sound and Music Computing Conference - "Illusions".
Logos Verlag Berlin.
ISBN 978-3-8325-3180-5.
p. 258–263.
Full text in Research Archive
With musical applications in mind, this paper reports on the level of noise observed in two commercial infrared marker-based motion capture systems: one high-end (Qualisys) and one affordable (OptiTrack). We have tested how various features (calibration volume, marker size, sampling frequency, etc.) influence the noise level of markers lying still, and fixed to subjects standing still. The conclusion is that the motion observed in humans standing still is usually considerably higher than the noise level of the systems. Depending on the system and its calibration, however, the signal-to-noise ratio may in some cases be problematic.
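As an illustrative sketch of the underlying measurement (synthetic numbers, not the paper's data), the noise level of a position track can be summarized as its RMS deviation from the mean position:

```python
import math
import random
import statistics

def rms_deviation(track):
    """RMS deviation from the mean position: a simple noise-level measure."""
    mu = statistics.fmean(track)
    return math.sqrt(statistics.fmean([(x - mu) ** 2 for x in track]))

# Synthetic 1-D position tracks in millimetres; the magnitudes are invented,
# chosen only to mirror the qualitative finding (postural sway >> system noise).
random.seed(7)
still_marker = [random.gauss(0.0, 0.05) for _ in range(2000)]   # system noise
standing_human = [random.gauss(0.0, 1.0) for _ in range(2000)]  # postural sway

noise_floor = rms_deviation(still_marker)
human_motion = rms_deviation(standing_human)
```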
-
Skogstad, Ståle Andreas van Dorp; Nymoen, Kristian; de Quay, Yago & Jensenius, Alexander Refsum
(2012).
Developing the Dance Jockey system for musical interaction with the Xsens MVN suit.
In Essl, Georg; Gillespie, Brent; Gurevich, Michael & O’Modhrain, Sile (Ed.),
Proceedings of the International Conference on New Interfaces for Musical Expression.
University of Michigan Press.
ISBN 978-0-9855720-1-3.
p. 226–229.
Full text in Research Archive
In this paper we present the Dance Jockey System, a system developed for using a full body inertial motion capture suit (Xsens MVN) in music/dance performances. We present different strategies for extracting relevant postures and actions from the continuous data, and how these postures and actions can be used to control sonic and musical features. The system has been used in several public performances, and we believe it has great potential for further exploration. However, to overcome the current practical and technical challenges when working with the system, it is important to further refine tools and software in order to facilitate making of new performance pieces.
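As a toy illustration of extracting postures from continuous data (a made-up detector, not the Dance Jockey implementation), a posture can be confirmed once a feature stays past a threshold for a number of consecutive frames:

```python
def detect_posture(heights, threshold=1.5, hold_frames=10):
    """Hypothetical 'hand raised' detector: fires when the height stream stays
    above `threshold` for `hold_frames` consecutive frames. Returns the frame
    indices at which the posture is first confirmed."""
    events, run = [], 0
    for i, y in enumerate(heights):
        run = run + 1 if y > threshold else 0
        if run == hold_frames:  # confirmed exactly once per sustained posture
            events.append(i)
    return events

# 20 frames low, 30 frames raised, 20 frames low (heights in metres, invented).
stream = [1.0] * 20 + [1.8] * 30 + [1.0] * 20
events = detect_posture(stream)  # each event could trigger a sonic change
```

The hold-frames hysteresis is one simple way to keep sensor noise from triggering spurious musical events.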
-
Nymoen, Kristian; Voldsund, Arve; Skogstad, Ståle Andreas van Dorp; Jensenius, Alexander Refsum & Tørresen, Jim
(2012).
Comparing Motion Data from an iPod Touch to a High-End Optical Infrared Marker-Based Motion Capture System.
In Essl, Georg; Gillespie, Brent; Gurevich, Michael & O’Modhrain, Sile (Ed.),
Proceedings of the International Conference on New Interfaces for Musical Expression.
University of Michigan Press.
ISBN 978-0-9855720-1-3.
p. 88–91.
Full text in Research Archive
The paper presents an analysis of the quality of motion data from an iPod Touch (4th gen.). Acceleration and orientation data derived from the internal sensors of the iPod are compared to data from a high-end optical infrared marker-based motion capture system (Qualisys) in terms of latency, jitter, accuracy and precision. We identify some rotational drift in the iPod, and some time lag between the two systems. Still, the iPod motion data is quite reliable, especially for describing relative motion over a short period of time.
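A common way to quantify the time lag between two such streams (a generic sketch with invented data, not the paper's analysis) is to maximize the cross-correlation over candidate lags:

```python
import random

def estimate_lag(reference, delayed, max_lag=20):
    """Estimate by how many samples `delayed` lags behind `reference`,
    maximizing the raw cross-correlation over non-negative lags."""
    best_lag, best_score = 0, float("-inf")
    n = len(delayed)
    for lag in range(max_lag + 1):
        score = sum(delayed[i] * reference[i - lag] for i in range(lag, n))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Synthetic acceleration-like signal and a copy delayed by 3 samples,
# standing in for the mocap and iPod streams.
random.seed(3)
mocap = [random.gauss(0.0, 1.0) for _ in range(500)]
ipod = [0.0] * 3 + mocap[:-3]  # same signal, three samples late

lag = estimate_lag(mocap, ipod)
```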
-
de Quay, Yago; Skogstad, Ståle Andreas van Dorp & Jensenius, Alexander Refsum
(2011).
Dance Jockey: Performing Electronic Music by Dancing.
Leonardo Music Journal.
ISSN 0961-1215.
21,
p. 11–12.
doi: 10.1162/LMJ_a_00052.
Full text in Research Archive
The authors present an experimental musical performance called Dance Jockey, wherein sounds are controlled by sensors on the dancer's body. These sensors manipulate music in real time by acquiring data about body actions and transmitting the information to a control unit that makes decisions and gives instructions to audio software. The system triggers a broad range of music events and maps them to sound effects and musical parameters such as pitch, loudness and rhythm.
-
Skogstad, Ståle Andreas van Dorp; Nymoen, Kristian; de Quay, Yago & Jensenius, Alexander Refsum
(2011).
OSC Implementation and Evaluation of the Xsens MVN suit.
In Jensenius, Alexander Refsum; Tveit, Anders; Godøy, Rolf Inge & Overholt, Dan (Ed.),
Proceedings of the International Conference on New Interfaces for Musical Expression.
Universitetet i Oslo.
ISBN 978-82-991841-6-8.
p. 300–303.
Full text in Research Archive
-
Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Jensenius, Alexander Refsum
(2011).
SoundSaber - A Motion Capture Instrument.
In Jensenius, Alexander Refsum; Tveit, Anders; Godøy, Rolf Inge & Overholt, Dan (Ed.),
Proceedings of the International Conference on New Interfaces for Musical Expression.
Universitetet i Oslo.
ISBN 978-82-991841-6-8.
p. 312–315.
Full text in Research Archive
The paper presents the SoundSaber - a musical instrument based on motion capture technology. We present technical details of the instrument and discuss the design development process. The SoundSaber may be used as an example of how high-fidelity motion capture equipment can be used for prototyping musical instruments, and we illustrate this with an example of a low-cost implementation of our motion capture instrument.
-
Skogstad, Ståle Andreas van Dorp; Nymoen, Kristian & Høvin, Mats Erling
(2011).
Comparing Inertial and Optical MoCap Technologies for Synthesis Control.
In Zanolla, Serena; Avanzini, Frederico; Canazza, Sergio & de Götzen, Amalia (Ed.),
Proceedings of SMC 2011 8th Sound and Music Computing Conference “Creativity rethinks science”.
Padova University Press.
ISBN 978-88-97385-03-5.
p. 421–426.
Full text in Research Archive
This paper compares the use of two different technologies for controlling sound synthesis in real time: the infrared marker-based motion capture system OptiTrack and Xsens MVN, an inertial sensor-based motion capture suit. We present various quantitative comparisons between the data from the two systems and results from an experiment where a musician performed simple musical tasks with the two systems. Both systems are found to have their strengths and weaknesses, which we will present and discuss.
-
Jensenius, Alexander Refsum; Glette, Kyrre Harald; Godøy, Rolf Inge; Høvin, Mats Erling; Nymoen, Kristian & Skogstad, Ståle Andreas van Dorp
(2010).
fourMs, University of Oslo – Lab Report.
In Rowe, Robert & Samaras, Dimitris (Ed.),
Proceedings of the International Computer Music Conference, June 1-5 2010, New York.
International Computer Music Association.
ISBN 0-9713192-8-6.
p. 290–293.
Full text in Research Archive
The paper reports on the development and activities in the recently established fourMs lab (Music, Mind, Motion, Machines) at the University of Oslo, Norway. As a meeting place for researchers in music and informatics, the fourMs lab is centred around studies of basic issues in music cognition, machine learning and robotics.
-
Skogstad, Ståle Andreas van Dorp; Jensenius, Alexander Refsum & Nymoen, Kristian
(2010).
Using IR Optical Marker Based Motion Capture for Exploring Musical Interaction.
In Beilharz, Kirsty; Johnston, Andrew; Ferguson, Sam & Chen, Amy Yi-Chun (Ed.),
NIME 2010 proceedings: New Interfaces for Musical Expression++, 15-18th June 2010, University of Technology Sydney.
Association for Computing Machinery (ACM).
ISBN 978-0-646-53482-4.
p. 407–410.
Full text in Research Archive
The paper presents a conceptual overview of how optical infrared marker-based motion capture systems (IrMoCap) can be used in musical interaction. First we present a review of related work using IrMoCap for musical control. This is followed by a discussion of possible features which can be exploited. Finally, the question of mapping movement features to sound features is presented and discussed.
-
Nymoen, Kristian; Jensenius, Alexander Refsum; Tørresen, Jim; Glette, Kyrre Harald & Skogstad, Ståle Andreas van Dorp
(2010).
Searching for Cross-Individual Relationships between Sound and Movement Features using an SVM Classifier.
In Beilharz, Kirsty; Johnston, Andrew; Ferguson, Sam & Chen, Amy Yi-Chun (Ed.),
NIME 2010 proceedings: New Interfaces for Musical Expression++, 15-18th June 2010, University of Technology Sydney.
Association for Computing Machinery (ACM).
ISBN 978-0-646-53482-4.
p. 259–262.
Full text in Research Archive
In this paper we present a method for studying relationships between features of sound and features of movement. The method has been tested by carrying out an experiment with people moving an object in space along with short sounds. 3D position data of the object was recorded and several features were calculated from each of the recordings. These features were provided as input to a classifier which was able to classify the recorded actions satisfactorily, particularly when taking into account that the only link between the actions performed by the different subjects was the sound they heard while making the action.
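The paper's classifier is an SVM; as a deliberately simpler stand-in using only the standard library, the same pipeline shape (feature vectors per recorded action, labels given by the sound heard) can be sketched with a nearest-centroid classifier:

```python
import math
import random
import statistics

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    return [statistics.fmean(column) for column in zip(*vectors)]

def nearest_centroid(sample, centroids):
    """Return the label whose class centroid is closest (Euclidean)."""
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

# Invented 2-D movement features (e.g. mean speed, vertical range) for
# actions performed along with two different sounds.
random.seed(5)
def make_class(mu):
    return [[random.gauss(m, 0.1) for m in mu] for _ in range(30)]

train = {"sound_a": make_class([0.2, 1.0]), "sound_b": make_class([1.0, 0.2])}
cents = {label: centroid(vs) for label, vs in train.items()}

pred = nearest_centroid([0.25, 0.95], cents)  # classify a new action
```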
-
Skogstad, Ståle Andreas van Dorp & Høvin, Mats Erling
(2006).
A Proposed Stability Characterization and Verification Method for High-Order Single-Bit Delta-Sigma Modulators.
In Johansson, Thomas (Ed.),
Proceedings 24th Norchip Conference.
IEEE conference proceedings.
ISBN 1-4244-0772-9.
p. 100–103.
View all works in Cristin
-
Skogstad, Ståle Andreas van Dorp & de Quay, Yago
(2011).
Dance Jockey.
-
Skogstad, Ståle Andreas van Dorp & de Quay, Yago
(2011).
Where Art Thou? Dance Jockey.
-
Burger, Birgitta; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Voldsund, Arve
(2011).
Optical Motion Capture Technology Workshop.
-
Skogstad, Ståle Andreas van Dorp
(2010).
Using Full Body Motion Capture Data as Input for Musical Sound Generation.
This poster presents interdisciplinary research on how one can use full body motion capture technology for musical control.
-
-
Skogstad, Ståle Andreas van Dorp & de Quay, Yago
(2010).
Dance Jockey.
Dance Jockey for Xsens motion capture suit & real-time sound synthesis. Performed 25 August 2010, Department of Musicology, University of Oslo. Developed and performed by Yago de Quay & Ståle Skogstad.
-
Herbelin, Bruno; Hansen, Hans-Ole; Jensenius, Alexander Refsum & Skogstad, Ståle Andreas van Dorp
(2009).
SUM biosensor.
-
Glette, Kyrre Harald; Tørresen, Jim & Skogstad, Ståle Andreas van Dorp
(2009).
Machine-learning techniques for action-sound MIR analysis and classification.
-
Jensenius, Alexander Refsum; Skogstad, Ståle Andreas van Dorp; Nymoen, Kristian; Godøy, Rolf Inge; Tørresen, Jim & Høvin, Mats Erling
(2009).
Reduced displays of multidimensional motion capture data sets of musical performance.
Background: Carrying out research in the field of music and movement involves working with different types of data (e.g. motion capture and sensor data) and media (i.e. audio, video), each having its own size, dimensions, speed etc. While each of the data types and media have their own analytical tools and representation techniques, we see the need for developing more tools that allow for studying all the data and media together in a synchronised manner. We have previously developed solutions for studying musical sound and movement in parallel by using synchronised spectrograms of audio and motiongrams of video. Now as we have started using an infrared motion capture system in our research, we see the need for better visualisation techniques of the highly multidimensional data sets being recorded (e.g. 50 markers x 3 dimensions x 100 Hz). While there are several techniques for doing this independently of audio and video, we are working on tools that integrate well with our displays of spectrograms and motiongrams.
Aims: Creating reduced representations of multidimensional motion capture data of complex music-related body movement that can be used together with spectrograms and motiongrams.
Results/Main Contribution: We present some of the visualisation techniques we have been developing to display multidimensional data sets: 1) reduction based on collapsing dimensions, 2) reduction based on frame differencing, 3) colour coding of movement features. We show how these techniques allow for displaying reduced displays of multidimensional motion capture data sets synchronised with spectrograms and motiongrams.
Conclusions/Implications: The techniques presented allow for studying relationships between movement and sound in music performance, and make it possible to create visual displays of movement and sound that can be used on screen and in printed documents.
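Reduction technique 2 (frame differencing) can be sketched as follows; this is a minimal illustration, not the lab's actual tool, collapsing a markers-by-dimensions mocap stream into a single quantity-of-motion curve:

```python
import math

def quantity_of_motion(frames):
    """Frame differencing: per-frame sum of marker displacements, reducing a
    (markers x 3 dimensions) stream to one value per frame transition."""
    return [
        sum(math.dist(p, c) for p, c in zip(prev, cur))
        for prev, cur in zip(frames, frames[1:])
    ]

# Two 3-D markers: still for three frames, then one marker moves 1 unit in x.
frames = [
    [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)],
    [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)],
    [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)],
    [(1.0, 0.0, 0.0), (1.0, 1.0, 1.0)],
]
qom = quantity_of_motion(frames)  # a 1-D curve that can sit under a spectrogram
```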
-
Høvin, Mats Erling; Jensenius, Alexander Refsum; Nymoen, Kristian & Skogstad, Ståle Andreas van Dorp
(2008).
Concert with the Oslo Laptop-orkester; conductor: the robot "Anna".
Published Aug. 30, 2017 1:08 PM. Last modified Aug. 30, 2017 3:05 PM.