Tags: Motion analysis, Motion capture, Audio analysis, Music Information Retrieval, Machine Learning
Publications
-
Lartillot, Olivier; Nymoen, Kristian; Câmara, Guilherme Schmidt & Danielsen, Anne (2021). Computational localization of attack regions through a direct observation of the audio waveform. Journal of the Acoustical Society of America, 149(1), 723–736. ISSN 0001-4966. doi: 10.1121/10.0003374
Summary:
This article addresses the computational estimation of attack regions in audio recordings. Previous attempts to do so were based on the reduction of the audio waveform into an envelope curve, which decreases its temporal resolution. The proposed approach detects the attack region directly from the audio waveform. The attack region is modeled as a line starting from a low-amplitude point and intersecting one of the local maxima according to two principles: (1) maximizing the slope, while favoring, at the same time, a higher peak if the slope remains only slightly lower and (2) dismissing initial attack regions of relatively low amplitude. The attack start position is fine-tuned by intersecting the attack slope with the audio waveform. The proposed method precisely pinpoints the attack region in cases where it is unambiguously observable from the waveform itself. In such cases, previous methods selected a broader attack region due to the loss of temporal resolution. When attack regions are less evident, the proposed method’s estimation remains within the range of results provided by other methods. Applied to the prediction of judgments of P-center localization [Danielsen, Nymoen, Anderson, Câmara, Langerød, Thompson, and London, J. Exp. Psychol. Hum. Percept. Perform. 45, 402–418 (2019)], the proposed method shows a significant increase in precision, at the expense of recall.
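The slope-maximization idea can be sketched in a few lines. This is an illustrative reconstruction from the abstract, not the authors' implementation; the 0.2 amplitude threshold, the peak-height bonus, and the 50 ms search window are assumptions.

```python
import numpy as np

def attack_region(wave, sr, start, window=0.05):
    """Sketch: from a low-amplitude start point, draw lines to the local
    maxima of the rectified waveform and keep the one with the best
    trade-off between steep slope and high peak (assumed weighting)."""
    seg = np.abs(wave[start:start + int(window * sr)])
    # Local maxima, excluding the first and last samples.
    peaks = np.flatnonzero((seg[1:-1] > seg[:-2]) & (seg[1:-1] >= seg[2:])) + 1
    peaks = peaks[seg[peaks] > 0.2 * seg.max()]  # dismiss relatively low peaks
    if peaks.size == 0:
        return start, start
    slopes = (seg[peaks] - seg[0]) / peaks       # amplitude gain per sample
    scores = slopes * (1.0 + 0.1 * seg[peaks])   # hypothetical peak-height bonus
    return start, start + int(peaks[np.argmax(scores)])  # start/end in samples
```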
-
Câmara, Guilherme Schmidt; Nymoen, Kristian; Lartillot, Olivier & Danielsen, Anne (2020). Effects of instructed timing on electric guitar and bass sound in groove performance. Journal of the Acoustical Society of America, 147(2), 1028–1041. ISSN 0001-4966. doi: 10.1121/10.0000724
Summary:
This paper reports on two experiments that investigated the expressive means through which musicians well versed in groove-based music signal the intended timing of a rhythmic event. Data were collected from 21 expert electric guitarists and 21 bassists, who were instructed to perform a simple rhythmic pattern in three different timing styles—“laidback,” “on-the-beat,” and “pushed”—in tandem with a metronome. As expected, onset and peak timing locations corresponded to the instructed timing styles for both instruments. Regarding sound, results for guitarists revealed systematic differences across participants in the duration and brightness [spectral centroid (SC)] of the guitar strokes played using these different timing styles. In general, laid-back strokes were played with a longer duration and a lower SC relative to on-the-beat and pushed strokes. Results for the bassists indicated systematic differences in intensity (sound-pressure level): pushed strokes were played with higher intensity than on-the-beat and laid-back strokes. These results lend further credence to the hypothesis that both temporal and sound-related features are important indications of the intended timing of a rhythmic event, and together these features offer deeper insight into the ways in which musicians communicate at the microrhythmic level in groove-based music.
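The brightness measure used here is the spectral centroid. As a point of reference, a minimal sketch of the standard definition (a generic formulation, not the authors' exact analysis settings):

```python
import numpy as np

def spectral_centroid(frame, sr):
    """Spectral centroid (SC): the amplitude-weighted mean frequency of a
    windowed frame. Higher SC is commonly heard as a brighter sound."""
    mag = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    return np.sum(freqs * mag) / (np.sum(mag) + 1e-12)  # guard silent frames
```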
-
Câmara, Guilherme Schmidt; Nymoen, Kristian; Lartillot, Olivier & Danielsen, Anne (2020). Timing Is Everything... Or Is It? Effects of Instructed Timing Style, Reference and Pattern on Drum Kit Sound in Groove-Based Performance. Music Perception, 38(1), 1–26. ISSN 0730-7829. doi: 10.1525/mp.2020.38.1.1. Full text in Research Archive.
Summary:
This study reports on an experiment that tested whether drummers systematically manipulated not only onset but also duration and/or intensity of strokes in order to achieve different timing styles. Twenty-two professional drummers performed two patterns (a simple “back-beat” and a complex variation) on a drum kit (hi-hat, snare, kick) in three different timing styles (laid-back, pushed, on-beat), in tandem with two timing references (metronome and instrumental backing track). As expected, onset location corresponded to the instructed timing styles for all instruments. The instrumental reference led to more pronounced timing profiles than the metronome (pushed strokes earlier, laid-back strokes later). Also, overall the metronome reference led to earlier mean onsets than the instrumental reference, possibly related to the “negative mean asynchrony” phenomenon. Regarding sound, results revealed systematic differences across participants in the duration (snare) and intensity (snare and hi-hat) of strokes played using the different timing styles. Pattern also had an impact: drummers generally played the rhythmically more complex pattern 2 louder than the simpler pattern 1 (snare and kick). Overall, our results lend further evidence to the hypothesis that both temporal and sound-related features contribute to the indication of the timing of a rhythmic event in groove-based performance.
-
Becker, Artur; Herrebrøden, Henrik; Gonzalez Sanchez, Victor Evaristo; Nymoen, Kristian; Dal Sasso Freitas, Carla Maria; Tørresen, Jim & Jensenius, Alexander Refsum (2019). Functional Data Analysis of Rowing Technique Using Motion Capture Data. In Grisha Coleman (ed.), Proceedings of the 6th International Conference on Movement and Computing. ACM Publications. ISBN 978-1-4503-7654-9. Article 12.
Summary:
We present an approach to analyzing the motion capture data of rowers using bivariate functional principal component analysis (bfPCA). The method has been applied to data from six elite rowers rowing on an ergometer. The analyses of the upper and lower body coordination during the rowing cycle revealed significant differences between the rowers, even though the data was normalized to account for differences in body dimensions. We make an argument for the use of bfPCA and other functional data analysis methods for the quantitative evaluation and description of technique in sports.
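The core of such a pipeline can be sketched with ordinary PCA on time-normalized curves. This is a simplified stand-in (plain PCA on a resampled grid rather than a full basis-expansion fPCA), with the grid size and component count as assumptions:

```python
import numpy as np

def bivariate_fpca(cycles, n_grid=100, n_components=3):
    """Sketch: each cycle is an (n_samples, 2) array, e.g. upper- and
    lower-body positions over one rowing stroke. Cycles are resampled to a
    common time grid, the two curves concatenated, and PCA applied."""
    t_new = np.linspace(0.0, 1.0, n_grid)
    X = np.array([
        np.concatenate([np.interp(t_new, np.linspace(0, 1, len(c)), c[:, d])
                        for d in range(2)])
        for c in cycles])
    X -= X.mean(axis=0)                              # center across cycles/rowers
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]  # per-cycle PC scores
    return scores, Vt[:n_components]                 # scores and modes of variation
```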
-
Danielsen, Anne; Nymoen, Kristian; Anderson, Evan; Câmara, Guilherme Schmidt; Langerød, Martin Torvik; Thompson, Marc R. & London, Justin (2019). Where is the beat in that note? Effects of attack, duration, and frequency on the perceived timing of musical and quasi-musical sounds. Journal of Experimental Psychology: Human Perception and Performance, 45(3), 402–418. ISSN 0096-1523. doi: 10.1037/xhp0000611
Summary:
The perceptual center (P-center) of a sound is typically understood as the specific moment at which it is perceived to occur. Using matched sets of real and artificial musical sounds as stimuli, we probed the influence of attack (rise time), duration, and frequency (center frequency) on perceived P-center location and P-center variability. Two different methods to determine the P-centers were used: Clicks aligned in-phase with the target sounds via the method of adjustment, and tapping in synchrony with the target sounds. Attack and duration were the primary cues for P-center location and P-center variability; P-center variability was found to be a useful measure of P-center shape. Consistent interactions between attack and duration were also found. Probability density distributions for each stimulus display a systematic pattern of P-center shapes ranging from narrow peaks close to the onset of sounds with fast attack and short duration, to wider and flatter shapes indicating a range of synchronization points for sounds with slow attack and long duration. The results support the conception of P-centers not as simple time points but as "beat bins" with characteristic shapes, and the shapes and locations of these beat bins are dependent upon both the stimulus and the synchronization task.
-
London, Justin; Nymoen, Kristian; Langerød, Martin Torvik; Thompson, Marc Richard; Code, David Løberg & Danielsen, Anne (2019). A comparison of methods for investigating the perceptual center of musical sounds. Attention, Perception & Psychophysics, 81(6), 2088–2101. ISSN 1943-3921. doi: 10.3758/s13414-019-01747-y. Full text in Research Archive.
Summary:
In speech and music, the acoustic and perceptual onset(s) of a sound are usually not congruent with its perceived temporal location. Rather, these "P-centers" are heard some milliseconds after the acoustic onset, and a variety of techniques have been used in speech and music research to find them. Here we report on a comparative study that uses various forms of the method of adjustment (aligning a click or filtered noise in-phase or anti-phase to a repeated target sound), as well as tapping in synchrony with a repeated target sound. The advantages and disadvantages of each method and probe type are discussed, and then all methods are tested using a set of musical instrument sounds that systematically vary in terms of onset/rise time (fast vs. slow), duration (short vs. long), and center frequency (high vs. low). For each method, the dependent variables were (a) the mean P-center location found for each stimulus type, and (b) the variability of the mean P-center location found for each stimulus type. Interactions between methods and stimulus categories were also assessed. We show that (a) in-phase and anti-phase methods of adjustment produce nearly identical results, (b) tapping vs. click alignment can provide different yet useful information regarding P-center locations, (c) the method of adjustment is sensitive to different sounds in terms of variability while tapping is not, and (d) using filtered noise as an alignment probe yields consistently earlier probe-onset locations in comparison to using a click as a probe.
-
Wallace, Benedikte; Martin, Charles Patrick & Nymoen, Kristian (2019). Tracing from Sound to Movement with Mixture Density Recurrent Neural Networks. In Grisha Coleman (ed.), Proceedings of the 6th International Conference on Movement and Computing. ACM Publications. ISBN 978-1-4503-7654-9. Article.
Summary:
In this work, we present a method for generating sound-tracings using a mixture density recurrent neural network (MDRNN). A sound-tracing is a rendering of perceptual qualities of short sound objects through body motion. The model is trained on a dataset of single point sound-tracings with multimodal input data and learns to generate novel tracings. We use a second neural network classifier to show that the input sound can be identified from generated tracings. This is part of an ongoing research effort to examine the complex correlations between sound and movement and the possibility of modelling these relationships using deep learning.
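The generative step of an MDRNN is easy to isolate: at each time step the network outputs mixture weights, means, and standard deviations, and the next motion frame is sampled from that mixture. A minimal sketch of the sampling step only (the recurrent network itself, and any conditioning on audio features, is omitted; variable names are illustrative):

```python
import numpy as np

def sample_mdn(pi, mu, sigma, rng=None):
    """Draw one motion frame from a mixture density output: pick a Gaussian
    component according to its mixture weight pi (must sum to 1), then
    sample from that component."""
    if rng is None:
        rng = np.random.default_rng()
    k = rng.choice(len(pi), p=pi)        # choose mixture component
    return rng.normal(mu[k], sigma[k])   # sample the next frame from it
```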
-
Watne, Åshild & Nymoen, Kristian (2018). Der æ so vent å vestoheio. Intonasjon i et gammelstev fra Setesdal. Musikk og tradisjon, 32, 7–30. ISSN 1892-0772. Full text in Research Archive.
Summary:
The intonation patterns in traditional Norwegian folk songs have been described and measured in various ways for more than a hundred years. This article provides a historical summary of research in this area and introduces a new software tool for measuring pitch. This is exemplified through our analysis of an unaccompanied folk song, "Der æ so vent å vestoheio", recorded by the Norwegian Broadcasting Corporation (NRK) in 1951, performed by Gro Heddi Brokke (1910–1997) from the valley of Setesdal. The 5th and octave scale degrees stand out as the most stable throughout the tune, with a lot of variation in the thirds, sixths, and even the tonic. In spite of this variation, the performance comes across as both confident and stable; the varying intonations appear controlled – they are not performer mistakes. Still, our findings suggest that tones of longer duration seem to vary less in intonation than shorter notes. We show how our software can be used in combination with manual analysis, and argue that automated pitch analysis may be useful also in the analysis of larger collections of Norwegian folk music.
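The basic measurement such a tool performs – tracking f0 and expressing each tone as a deviation in cents from a reference – can be sketched with an off-the-shelf pitch tracker. This is a generic sketch, not the authors' software; librosa and its pyin tracker are assumed dependencies, and the frequency bounds are placeholders:

```python
import numpy as np
import librosa

def intonation_in_cents(path, tonic_hz):
    """Track f0 of an unaccompanied song and express every voiced frame as a
    deviation in cents from the tonic (100 cents = one equal-tempered
    semitone), folded into a single octave for comparing scale degrees."""
    y, sr = librosa.load(path, sr=None, mono=True)
    f0, voiced, _ = librosa.pyin(y, fmin=librosa.note_to_hz('C2'),
                                 fmax=librosa.note_to_hz('C6'), sr=sr)
    cents = 1200.0 * np.log2(f0[voiced] / tonic_hz)
    return cents % 1200.0   # e.g. values near 700 indicate the fifth
```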
-
Watne, Åshild & Nymoen, Kristian (2018). Entreprenørskap i høyere norsk musikkutdanning. Nordisk musikkpedagogisk forskning: Årbok, 367–385. ISSN 1504-5021. Full text in Research Archive.
-
Nymoen, Kristian (2017). Expert Commentary: Gesture Following Made Accessible. In Alexander Refsum Jensenius & Michael J. Lyons (ed.), A NIME Reader: Fifteen Years of New Interfaces for Musical Expression. Springer Science+Business Media B.V. ISBN 978-3-319-47213-3. Expert Commentary, 281–282.
-
Nymoen, Kristian; Danielsen, Anne & London, Justin (2017). Validating Attack Phase Descriptors Obtained by the Timbre Toolbox and MIRtoolbox. In Tapio Lokki; Jukka Pätynen & Vesa Välimäki (ed.), Proceedings of the 14th Sound and Music Computing Conference 2017. Aalto University. ISBN 978-952-60-3729-5. Chapter, 214–219. Full text in Research Archive.
Summary:
The attack phase of sound events plays an important role in how sounds and music are perceived. Several approaches have been suggested for locating salient time points and critical time spans within the attack portion of a sound, and some have been made widely accessible to the research community in toolboxes for Matlab. While some work exists where proposed audio descriptors are grounded in listening tests, the approaches used in two of the most popular toolboxes for musical analysis have not been thoroughly compared against perceptual results. This article evaluates the calculation of attack phase descriptors in the Timbre toolbox and the MIRtoolbox by comparing their predictions to empirical results from a listening test. The results show that the default parameters in both toolboxes give inaccurate predictions for the sound stimuli in our experiment. We apply a grid search algorithm to obtain alternative parameter settings for these toolboxes that align their estimations with our empirical results.
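The parameter-tuning step described above amounts to an exhaustive search over descriptor settings against the listening-test data. A minimal sketch of that idea; the parameter names, the error metric (mean absolute error), and the extractor interface are assumptions, not the toolboxes' actual APIs:

```python
import itertools
import numpy as np

def grid_search(param_grid, estimate_attack, stimuli, perceptual_times):
    """Try every combination of descriptor parameters and keep the one whose
    attack-time estimates best match the perceptual results."""
    best_params, best_err = None, np.inf
    keys = list(param_grid)
    for values in itertools.product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        est = np.array([estimate_attack(s, **params) for s in stimuli])
        err = np.mean(np.abs(est - np.asarray(perceptual_times)))
        if err < best_err:
            best_params, best_err = params, err
    return best_params, best_err

# usage with hypothetical parameters of an attack-time estimator:
# grid = {"threshold": [0.1, 0.2, 0.3], "smoothing_ms": [5, 10, 20]}
```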
-
Godøy, Rolf Inge; Song, Min-Ho; Nymoen, Kristian; Haugen, Mari Romarheim & Jensenius, Alexander Refsum (2016). Exploring Sound-Motion Similarity in Musical Experience. Journal of New Music Research, 45(3), 210–222. ISSN 0929-8215. doi: 10.1080/09298215.2016.1184689
Summary:
People tend to perceive many salient similarities between musical sound and body motion in musical experience, as can be seen in countless situations of music performance or listening, and as has been documented by a number of studies in the past couple of decades. The so-called motor theory of perception has claimed that these similarity relationships are deeply rooted in human cognitive faculties, and that people perceive and make sense of what they hear by mentally simulating the body motion thought to be involved in the making of sound. In this paper, we survey some basic theories of sound-motion similarity in music, and in particular the motor theory perspective. We also present findings regarding sound-motion similarity in musical performance, in dance, in so-called sound-tracing (the spontaneous body motions people produce in tandem with musical sound), and in sonification, all in view of providing a broad basis for understanding sound-motion similarity in music.
-
Haugen, Mari Romarheim & Nymoen, Kristian (2016). Evaluating Input Devices for Dance Research. In Richard Kronland-Martinet; Mitsuko Aramaki & Sølvi Ystad (ed.), Music, Mind, and Embodiment: 11th International Symposium, CMMR 2015, Plymouth, UK, June 16-19, 2015, Revised Selected Papers. Springer. ISBN 978-3-319-46281-3. Sound, Motion and Gesture, 58–70. Full text in Research Archive.
Summary:
Recording music-related motions in ecologically valid situations can be challenging. We investigate the performance of three devices providing 3D acceleration data – the Axivity AX3, an iPhone 4s, and a Wii controller – for tracking rhythmic motions. The devices are benchmarked against an infrared motion capture system, tested on both simple and complex music-related body motions, and evaluations are presented of the data quality and suitability for tracking music-related motions in real-world situations. The various systems represent different trade-offs with respect to data quality, user interface and physical attributes.
-
Nymoen, Kristian; Chandra, Arjun & Tørresen, Jim (2016). Self-awareness in Active Music Systems. In Peter R. Lewis; Marco Platzner; Bernhard Rinner; Jim Tørresen & Xin Yao (ed.), Self-aware Computing Systems. Springer. ISBN 978-3-319-39674-3. Chapter 14, 279–296.
-
Wang, Shuo; Nebehay, Georg; Esterle, Lukas; Nymoen, Kristian & Minku, Leandro L. (2016). Common Techniques for Self-awareness and Self-expression. In Peter R. Lewis; Marco Platzner; Bernhard Rinner; Jim Tørresen & Xin Yao (ed.), Self-aware Computing Systems. Springer. ISBN 978-3-319-39674-3. Chapter 7, 113–142.
-
Barrett, Natasha & Nymoen, Kristian (2015). Investigations in coarticulated performance gestures using interactive parameter-mapping 3D sonification. Proceedings of the International Conference on Auditory Display. ISSN 2168-5126.
-
Bojic, Iva & Nymoen, Kristian (2015). Survey on synchronization mechanisms in machine-to-machine systems. Engineering Applications of Artificial Intelligence, 45, 361–375. ISSN 0952-1976. doi: 10.1016/j.engappai.2015.07.007
Summary:
People have always tried to understand natural phenomena. In computer science, natural phenomena are mostly used as a source of inspiration for solving various problems in distributed systems such as optimization, clustering, and data processing. In this paper we give an overview of research in the field of computer science where fireflies in nature are used as role models for time synchronization. We compare two models of oscillators that explain firefly synchronization along with other phenomena of synchrony in nature (e.g., synchronization of pacemaker cells of the heart and synchronization of neuron networks of the circadian pacemaker). Afterwards, we present Mirollo and Strogatz's pulse-coupled oscillator model together with its limitations. As discussed by the authors of the model, it lacks an explanation of what happens when oscillators are nonidentical. It also does not support mobile and faulty oscillators. Finally, it does not take into consideration that communication among oscillators involves delays. Since these limitations prevent Mirollo and Strogatz's model from being used in real-world environments (such as Machine-to-Machine systems), we sum up related work in which scholars investigated how to modify the model in order for it to be applicable in distributed systems. However, one has to bear in mind that there are usually large differences between mathematical models in theory and their implementation in practice. Therefore, we give an overview of both mathematical models and mechanisms in distributed systems that were designed after them.
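The pulse-coupled oscillator model discussed above is compact enough to simulate directly. A minimal event-driven sketch under simplifying assumptions (identical oscillators, no delays, and arbitrary values for the curvature b and coupling eps):

```python
import numpy as np

def simulate_ms(n=10, cycles=50, b=3.0, eps=0.02, seed=0):
    """Mirollo-Strogatz pulse-coupled oscillators. The state rises along a
    concave curve x = f(phase); a firing adds eps to every other state.
    Concavity makes the pairwise return map contracting, so oscillators are
    progressively absorbed into one synchronously firing group."""
    f = lambda p: np.log1p((np.exp(b) - 1.0) * p) / b      # phase -> state
    finv = lambda x: np.expm1(b * x) / (np.exp(b) - 1.0)   # state -> phase
    phase = np.sort(np.random.default_rng(seed).random(n))
    events = []
    for _ in range(cycles * n):
        i = np.argmax(phase)
        phase += 1.0 - phase[i]        # advance time until the next firing
        cluster = phase >= 1.0         # co-phased oscillators fire together
        phase[~cluster] = np.minimum(
            finv(f(phase[~cluster]) + eps), 1.0)  # pulse via the state curve
        cluster |= phase >= 1.0        # absorb any pushed to the threshold
        events.append(int(cluster.sum()))
        phase[cluster] = 0.0
    return events  # cluster sizes grow toward n as the group synchronizes
```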
-
Haugen, Mari Romarheim & Nymoen, Kristian (2015). Evaluating Input Devices for Dance Research. In Proceedings of the 11th International Symposium on Computer Music Multidisciplinary Research. The Interdisciplinary Centre for Computer Music Research. ISBN 978-2-909669-24-3. VI - Sound, Motion and Gesture - Part 2, 263–270. Full text in Research Archive.
Summary:
Recording music-related motions in ecologically valid situations can be challenging. We investigate the performance of three devices providing 3D acceleration data, namely Axivity AX3, iPhone 4s and a Wii controller tracking rhythmic motions. The devices are benchmarked against an infrared motion capture system. The devices tracked simple and complex rhythmic motions to pre-recorded music and were evaluated both in terms of data quality and in terms of how suitable the systems seem for tracking music-related motions in real-world situations. The various systems represent different trade-offs with respect to timing, accuracy and precision.
-
Nymoen, Kristian; Chandra, Arjun; Glette, Kyrre Harald & Tørresen, Jim (2015). Decentralized harmonic synchronization in mobile music systems. In Kurosh Madani; Neil Y. Yen & Yusuke Manabe (ed.), Awareness Science and Technology (iCAST), 2014 IEEE 6th International Conference on. IEEE. ISBN 978-1-4799-7373-6. Proceedings article, 51–56.
Summary:
A system for decentralized synchronization of musical agents is presented, inspired by Mirollo and Strogatz' pulse-coupled oscillator model of the synchronous flashing of certain species of firefly. While most previous work on pulse-coupled oscillators assumes fixed and (close to) equal oscillator frequencies, the presented system tackles the challenge of different starting frequencies. Open source implementations in Puredata, Max, and Matlab are provided. Test results for setups of six nodes show that nodes reach a state of harmonic synchrony, where fire events coincide and oscillators display integer-ratio frequency relations.
-
Nymoen, Kristian; Haugen, Mari Romarheim & Jensenius, Alexander Refsum (2015). MuMYO – Evaluating and Exploring the MYO Armband for Musical Interaction. In Edgar Berdahl (ed.), Proceedings of the International Conference on New Interfaces For Musical Expression. Louisiana State University. Chapter. Full text in Research Archive.
Summary:
The MYO armband from Thalmic Labs is a complete and wireless motion and muscle sensing platform. This paper evaluates the armband’s sensors and its potential for NIME applications. This is followed by a presentation of the prototype instrument MuMYO. We conclude that, despite some shortcomings, the armband has the potential to become a new “standard” controller in the NIME community.
-
Nymoen, Kristian; Chandra, Arjun; Glette, Kyrre Harald; Tørresen, Jim; Voldsund, Arve & Jensenius, Alexander Refsum (2014). PheroMusic: Navigating a Musical Space for Active Music Experiences. In Georgios Kouroupetroglou & Anastasia Georgaki (ed.), Music Technology Meets Philosophy: From digital echos to virtual ethos. Proceedings of the 40th International Computer Music Conference joint with the 11th Sound and Music Computing Conference. International Computer Music Association. ISBN 978-0-9845274-3-4. Chapter, 1715–1718. Full text in Research Archive.
Summary:
We consider the issue of how a flexible musical space can be manipulated by users of an active music system. The musical space is navigated by selecting transitions between its different sections. We take inspiration from pheromone trails in ant colonies to propose and investigate an approach that allows an artificial agent to navigate such musical spaces in accordance with the preferences of the user, and a set of boundaries specified by the designer of the musical space.
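The pheromone mechanism can be sketched as a weighted transition matrix with evaporation and reinforcement. An illustrative reconstruction from the abstract; the parameter values and function names are assumptions, not the PheroMusic implementation:

```python
import numpy as np

def next_section(pheromone, current, rng, evaporation=0.05, deposit=1.0):
    """Pick the next section of the musical space with probability
    proportional to pheromone on the outgoing transitions, then evaporate
    all trails slightly and reinforce the transition actually taken."""
    probs = pheromone[current] / pheromone[current].sum()
    nxt = int(rng.choice(len(probs), p=probs))
    pheromone *= 1.0 - evaporation       # trails decay unless revisited
    pheromone[current, nxt] += deposit   # remember the user's choice
    return nxt

# usage: a five-section space with uniform initial trails
rng = np.random.default_rng(1)
trails = np.ones((5, 5))
section = 0
for _ in range(8):
    section = next_section(trails, section, rng)
```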
-
Nymoen, Kristian; Chandra, Arjun & Tørresen, Jim (2014). The Challenge of Decentralised Synchronisation in Interactive Music Systems. In Ingo Scholtes (ed.), Self-Adaptation and Self-Organizing Systems Workshops (SASOW), 2013 IEEE 7th International Conference on. IEEE. ISBN 978-1-4799-5086-7. Proceedings article, 95–100.
Summary:
Synchronisation is an important part of collaborative music systems, and with such systems implemented on mobile devices, the implementation of algorithms for synchronisation without central control becomes increasingly important. Decentralised synchronisation has been researched in many areas, and some challenges have been solved. However, some of the assumptions that are often made in this research are not suitable for mobile musical systems. We present an implementation of a firefly-inspired algorithm for synchronisation of musical agents with fixed and equal tempo, and lay out the road ahead towards synchronisation between agents with large differences in tempo. The effect of introducing human-controlled nodes in the network of otherwise agent-controlled nodes is examined.
-
Nymoen, Kristian; Tørresen, Jim; Song, Sichao & Hafting, Yngve (2014). Funky Sole Music: Gait Recognition and Adaptive Mapping. In Baptiste Caramiaux; Koray Tahiroğlu; Rebecca Fiebrink & Atau Tanaka (ed.), Proceedings of the International Conference on New Interfaces For Musical Expression. Goldsmiths, University of London. ISBN 978-1-906897-29-1. Chapter, 299–302. Full text in Research Archive.
Summary:
We present Funky Sole Music, a musical interface employing a sole embedded with three force sensitive resistors in combination with a novel algorithm for continuous movement classification. A heuristics-based music engine has been implemented, allowing users to control high-level parameters of the musical output. This provides a greater degree of control to users without musical expertise than they get with traditional media players. By using the movement classification result not as a direct control action in itself, but as a way to change mapping spaces and musical sections, the control possibilities offered by the simple interface are greatly increased.
-
Chandra, Arjun; Nymoen, Kristian; Voldsund, Arve; Jensenius, Alexander Refsum; Glette, Kyrre Harald & Tørresen, Jim (2013). Market-based Control in Interactive Music Environments. In Mitsuko Aramaki; Mathieu Barthet; Richard Kronland-Martinet & Sølvi Ystad (ed.), From Sounds to Music and Emotions - 9th International Symposium, CMMR 2012, London, UK, June 19-22, 2012, Revised Selected Papers. Springer. ISBN 978-3-642-41247-9. Chapter, 439–458.
Summary:
The paper presents the interactive music system SoloJam, which allows a group of participants with little or no musical training to effectively play together in a “band-like” setting. It allows the participants to take turns playing solos made up of rhythmic pattern sequences. We specify the issue at hand for enabling such participation as being the requirement of decentralised coherent circulation of playing solos. Satisfying this requirement necessitates some form of intelligence within the devices used for participation, with each participant being associated with their respective enabling device. Markets consist of buyers and sellers, who interact with each other in order to trade commodities. Based on this idea, we let devices enable buying and selling, more precisely bidding and auctioneering, and assist participants in trading in musical terms. Consequentially, the intelligence in the devices is modelled as their ability to help participants trade solo playing responsibilities with each other. This requires them to possess the capability of assessing the utility of the associated participant’s deservedness of being the soloist, the capability of holding auctions on behalf of the participant, and of enabling the participant to bid within these auctions. We show that holding auctions and helping bid within them enables decentralisation of co-ordinating solo circulation, and a properly designed utility function enables coherence in the musical output. The market-based approach helps achieve decentralised coherent circulation with artificial agents simulating human participants. The effectiveness of the approach is further supported when human users participate. As a result, the approach is shown to be effective at enabling participants with little or no musical training to play together in SoloJam.
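The market mechanism reduces to a sealed-bid auction per handover: each device computes a utility expressing how much its user "deserves" the next solo, and the current soloist's device acts as auctioneer. A toy sketch; the utility function shown is a placeholder, not SoloJam's actual formulation:

```python
def utility(bars_since_last_solo, activity):
    """Placeholder deservedness measure: users who are active but have not
    soloed recently bid higher, which keeps the solo circulating."""
    return activity * (1.0 + bars_since_last_solo)

def run_auction(bids, auctioneer):
    """The current soloist auctions off the next solo; the highest bidder
    among the other devices wins."""
    contenders = {dev: bid for dev, bid in bids.items() if dev != auctioneer}
    return max(contenders, key=contenders.get)

# usage: device 0 is soloing; devices 1-3 bid their utilities
bids = {1: utility(4, 0.8), 2: utility(1, 0.9), 3: utility(6, 0.5)}
print(run_auction(bids, auctioneer=0))  # -> 1
```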
-
Nymoen, Kristian; Godøy, Rolf Inge; Jensenius, Alexander Refsum & Tørresen, Jim (2013). Analyzing Correspondence between Sound Objects and Body Motion. ACM Transactions on Applied Perception, 10(2). ISSN 1544-3558. doi: 10.1145/2465780.2465783. Full text in Research Archive.
Summary:
Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional data that musical sound and body motion present. The article evaluates four different analysis methods applied to an experiment in which participants moved their hands following perceptual features of short sound objects. Motion capture data has been analyzed and correlated with a set of quantitative sound features using four different methods: (a) a pattern recognition classifier, (b) t-tests, (c) Spearman’s ρ correlation, and (d) canonical correlation. This article shows how the analysis methods complement each other, and that applying several analysis techniques to the same data set can broaden the knowledge gained from the experiment.
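Two of the four methods – feature-pair correlation and canonical correlation – are straightforward to sketch with standard tools. This is a generic illustration of the techniques named in the abstract (scipy and scikit-learn are assumed dependencies, and the feature matrices are placeholders, not the study's data):

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.cross_decomposition import CCA

def correlate_sound_motion(sound_feats, motion_feats):
    """sound_feats: (n_tracings, n_sound_features); motion_feats likewise.
    Returns Spearman's rho for one feature pair (e.g. pitch vs. hand height)
    and the first canonical correlation across all features."""
    rho, p = spearmanr(sound_feats[:, 0], motion_feats[:, 0])
    cca = CCA(n_components=1).fit(sound_feats, motion_feats)
    u, v = cca.transform(sound_feats, motion_feats)
    r = np.corrcoef(u[:, 0], v[:, 0])[0, 1]   # first canonical correlation
    return rho, r
```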
-
Skogstad, Ståle Andreas van Dorp; Nymoen, Kristian; Høvin, Mats Erling; Holm, Sverre & Jensenius, Alexander Refsum (2013). Filtering Motion Capture Data for Real-Time Applications. In Woon Seung Yeo; Kyogu Lee; Alexander Sigman; Haru Ji & Graham Wakefield (ed.), Proceedings of the International Conference on New Interfaces For Musical Expression. Korea Advanced Institute of Science and Technology. Chapter, 142–147. Full text in Research Archive.
Summary:
In this paper we present some custom designed filters for real-time motion capture applications. Our target application is motion controllers, i.e. systems that interpret hand motion for musical interaction. In earlier research we found effective methods to design nearly optimal filters for real-time applications. However, to be able to design suitable filters for our target application, it is necessary to establish the typical frequency content of the motion capture data we want to filter. This in turn allows us to determine a reasonable cutoff frequency for the filters. We have therefore conducted an experiment in which we recorded the hand motion of 20 subjects. The frequency spectra of these data, together with a method similar to the residual analysis method, were then used to determine reasonable cutoff frequencies. Based on this experiment, we propose three cutoff frequencies for different scenarios and filtering needs: 5, 10 and 15 Hz, which correspond to heavy, medium and light filtering, respectively. Finally, we propose a range of real-time filters applicable to motion controllers. In particular, low-pass filters and low-pass differentiators of degrees one and two, which in our experience are the most useful filters for our target application.
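As a rough stand-in for experimenting with the proposed cutoffs, a causal Butterworth low-pass and a filter-then-differentiate velocity estimate can be sketched as follows. These are generic scipy filters, not the authors' custom near-optimal designs:

```python
import numpy as np
from scipy.signal import butter, lfilter

def lowpass(x, sr, cutoff=10.0, order=2):
    """Causal low-pass, usable sample-by-sample in real time. The paper's
    suggested cutoffs: 5 Hz (heavy), 10 Hz (medium), 15 Hz (light)."""
    b, a = butter(order, cutoff / (sr / 2.0), btype='low')
    return lfilter(b, a, x)

def velocity(position, sr, cutoff=10.0):
    """Low-pass differentiator sketch: smooth first, then first difference."""
    return np.diff(lowpass(position, sr, cutoff)) * sr
```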
-
Tørresen, Jim; Hafting, Yngve & Nymoen, Kristian (2013). A new wi-fi based platform for wireless sensor data collection. In Woon Seung Yeo; Kyogu Lee; Alexander Sigman; Haru Ji & Graham Wakefield (ed.), Proceedings of the International Conference on New Interfaces For Musical Expression. Korea Advanced Institute of Science and Technology. 22, 337–340.
-
Chandra, Arjun; Nymoen, Kristian; Voldsund, Arve; Jensenius, Alexander Refsum; Glette, Kyrre Harald & Tørresen, Jim (2012). Enabling Participants to Play Rhythmic Solos Within a Group via Auctions. In Richard Kronland-Martinet; Sølvi Ystad; Mitsuko Aramaki; Mathieu Barthet & Simon Dixon (ed.), Proceedings of the 9th International Symposium on Computer Music Modeling and Retrieval. Queen Mary, University of London. Chapter, 674–689. Full text in Research Archive.
Summary:
The paper presents the interactive music system SoloJam, which allows a group of participants with little or no musical training to effectively play together in a “band-like” setting. It allows the participants to take turns playing solos made up of rhythmic pattern sequences. We specify the issue at hand for allowing such participation as being the requirement of decentralised coherent circulation of playing solos. This is to be realised by some form of intelligence within the devices used for participation. Here we take inspiration from the Economic Sciences, and propose that this intelligence take the form of devices possessing the capability of evaluating their utility of playing the next solo, the capability of holding auctions, and of bidding within them. We show that holding auctions and bidding within them enables decentralisation of co-ordinating solo circulation, and a properly designed utility function enables coherence in the musical output. The approach helps achieve decentralised coherent circulation with artificial agents simulating human participants. The effectiveness of the approach is further supported when human users participate. As a result, the approach is shown to be effective at enabling participants with little or no musical training to play together in SoloJam.
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum; Voldsund, Arve; Glette, Kyrre Harald; Høvin, Mats Erling; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Tørresen, Jim (2012). Classifying Music-Related Actions. In Emilios Cambouropoulos; Costas Tsougras; Panayotis Mavromatis & Konstantinos Pastiadis (ed.), Proceedings of the ICMPC-ESCOM 2012 Joint Conference: 12th Biennial International Conference for Music Perception and Cognition, 8th Triennial Conference of the European Society for the Cognitive Sciences of Music. School of Music Studies, Aristotle University of Thessaloniki, Hellas. ISBN 978-960-99845-1-5. Proceedings article, 352–357. Full text in Research Archive.
Summary:
Our research on music-related actions is based on the conviction that sensations of both sound and body motion are inseparable in the production and perception of music. The expression "music-related actions" is here used to refer to chunks of combined sound and body motion, typically in the duration range of approximately 0.5 to 5 seconds. We believe that chunk-level music-related actions are highly significant for the experience of music, and we are presently working on establishing a database of music-related actions in order to facilitate access to, and research on, our fast growing collection of motion capture data and related material. In this work, we are confronted with a number of perceptual, conceptual and technological issues regarding classification of music-related actions, issues that will be presented and discussed in this paper.
-
Jensenius, Alexander Refsum; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Voldsund, Arve (2012). A Study of the Noise-Level in Two Infrared Marker-Based Motion Capture Systems. In Bob L. Sturm; Sofia Dahl & Jan Larsen (ed.), Proceedings of the 9th Sound and Music Computing Conference - "Illusions". Logos Verlag Berlin. ISBN 9783832531805. Paper, 258–263. Full text in Research Archive.
Summary:
With musical applications in mind, this paper reports on the level of noise observed in two commercial infrared marker-based motion capture systems: one high-end (Qualisys) and one affordable (OptiTrack). We have tested how various factors (calibration volume, marker size, sampling frequency, etc.) influence the noise level of markers lying still, and fixed to subjects standing still. The conclusion is that the motion observed in humans standing still is usually considerably higher than the noise level of the systems. Depending on the system and its calibration, however, the signal-to-noise ratio may in some cases be problematic.
-
Kozak, Mariusz; Nymoen, Kristian & Godøy, Rolf Inge (2012). The Effects of Spectral Features of Sound on Gesture Type and Timing. In Eleni Efthimiou; Georgios Kouroupetroglou & Stavroula-Evita Fotinea (ed.), Gesture and Sign Language in Human-Computer Interaction and Embodied Communication. Springer. ISBN 978-3-642-34181-6. Chapter, 69–80.
-
Nymoen, Kristian; Tørresen, Jim; Godøy, Rolf Inge & Jensenius, Alexander Refsum (2012). A Statistical Approach to Analyzing Sound Tracings. In Sølvi Ystad; Mitsuko Aramaki; Richard Kronland-Martinet; Kristoffer Jensen & Sanghamitra Mohanty (ed.), Speech, Sound and Music Processing: Embracing Research in India. 8th International Symposium, CMMR 2011. 20th International Symposium, FRSM 2011. Springer. ISBN 978-3-642-31979-2. Book chapter, 120–145. Full text in Research Archive.
Summary:
This paper presents an experiment on sound tracing, meaning an experiment on how people relate motion to sound. 38 participants were presented with 18 short sounds, and instructed to move their hands in the air while acting as though the sound was created by their hand motion. The hand motion of the participants was recorded, and has been analyzed using statistical tests, comparing results between different sounds, between different subjects, and between different sound classes. We have identified several relationships between sound and motion which are present in the majority of the subjects. A clear distinction was found in onset acceleration for motion to sounds with an impulsive dynamic envelope compared to non-impulsive sounds. Furthermore, vertical movement has been shown to be related to sound frequency, both in terms of spectral centroid and pitch. Moreover, a significantly higher amount of overall acceleration was observed for non-pitched sounds as compared to pitched sounds.
-
Kjørstad, Elise & Nymoen, Kristian (2020, 10 May). Kunstig intelligens har komponert sanger til Eurovision-inspirert konkurranse. [Internet]. forskning.no.
-
Wallace, Benedikte; Nymoen, Kristian; Martin, Charles Patrick & Tørresen, Jim (2020). Towards Movement Generation with Audio Features.
-
Wallace, Benedikte; Nymoen, Kristian & Martin, Charles Patrick (2019). Tracing from Sound to Movement with Mixture Density Recurrent Neural Networks.
-
Becker, Artur; Herrebrøden, Henrik; Gonzalez Sanchez, Victor Evaristo; Nymoen, Kristian; Dal Sasso Freitas, Carla Maria; Tørresen, Jim & Jensenius, Alexander Refsum (2019). Functional Data Analysis of Rowing Technique Using Motion Capture Data.
-
Câmara, Guilherme Schmidt; Nymoen, Kristian & Danielsen, Anne (2019). Timing is Everything... Or is it? Part II: Effects of Instructed Timing Style and Timing Reference on Drum-Kit Sound in Groove Performance.
-
Câmara, Guilherme Schmidt; Nymoen, Kristian; Lartillot, Olivier & Danielsen, Anne (2019). Timing is Everything... Or is it? Part I: Effects of Instructed Timing and Reference on Guitar and Bass Sound in Groove Performance.
-
Danielsen, Anne & Nymoen, Kristian (2019). Where is the beat in that note?.
-
Danielsen, Anne; Nymoen, Kristian & London, Justin (2019). Noise in the click or click in the noise: Investigating probe-stimulus order in P-center estimation tasks.
-
Jensenius, Alexander Refsum; Martin, Charles Patrick; Erdem, Cagri; Lan, Qichao; Fuhrer, Julian Peter; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata; Glette, Kyrre & Nymoen, Kristian (2019). Self-playing Guitars.
Summary:
In this installation we explore how six self-playing guitars can entrain to each other. When they are left alone they will revert to playing a common pulse. As soon as they sense people in their surroundings they will start entraining to other pulses. The result is a fascinating exploration of a basic physical and cognitive concept, and the musically interesting patterns that emerge on the border between order and chaos.
-
Nymoen, Kristian (2019). Musikalske maskiner.
-
Sioros, George; Câmara, Guilherme Schmidt; Danielsen, Anne & Nymoen, Kristian (2019). Timing and drummers’ movement: A novel methodology for performance analysis.
Summary:
Timing is an important aspect of groove music. The relationship between musicians’ body motion in performance and timing is, however, as yet not well understood. In the present study we recorded the movement of 20 drummers performing the same rhythmic pattern under four different timing instructions: natural, on-the-beat, laid-back and pushed. Motion capture data synchronized to audio recordings of their performances were collected as part of a larger experimental project. This presentation focuses on our method for analyzing motion capture data. The aim of the analysis is a) to identify common movement strategies for sub-groups of drummers, and b) to identify strategies for achieving the four different timing conditions across drummers. In this presentation we focus on the movement of the left arm, and particularly on the preparation and rebound phases of the snare strokes. To explore and analyze the data without statistically testing a priori hypotheses about specific performance techniques, we combined existing practices from different disciplines into a novel methodology. First, we reduce the data into motion templates (Müller and Röder 2006). We design a set of 22 binary features to describe the movement of the arm. Second, we perform a phylogenetic analysis of the motion templates, in which we identify clusters within each timing condition. A comparison between clusters reveals differences in the coordination of the participants’ movements that correspond to the different performance strategies. Preliminary analysis has shown distinct clusters within all timing conditions that differ in specific features. For instance, we observe three groups of participants within the “natural” condition that differ in the flexion of the wrist and elbow. Besides our findings we will present the details of the methodology, which can be applied in the study of music-related movements beyond the scope of this project.
-
Lartillot, Olivier; Nymoen, Kristian & Danielsen, Anne (2018). Prediction of P-centers from audio recordings.
-
London, Justin; Danielsen, Anne & Nymoen, Kristian (2018). Where is the beat in that note? Effects of attack, frequency, and duration on the p-centers of musical and quasi-musical sounds.
-
Nymoen, Kristian (2018). Music Information Retrieval.
-
Nymoen, Kristian (2018). Soundtracer - Et innblikk i innholdsbasert arkivsøk.
Summary:
A music recording contains large amounts of information. The field of Music Information Retrieval researches techniques for extracting information from the audio signal itself. These techniques offer new and exciting possibilities for searching audio archives – either as a supplement to, or instead of, metadata. How can one search an audio archive for a particular intonation pattern or a particular rhythm? And what does it take to search an archive by humming, clapping, or moving?
-
Danielsen, Anne; London, Justin & Nymoen, Kristian (2017). Mapping the Beat Bin: The Effects of Rise Time, Duration, and Frequency Range on the Perceived Timing of Musical Sounds.
-
Danielsen, Anne; London, Justin & Nymoen, Kristian (2017). Mapping the beat bin: Effects of rise time, duration and frequency range on the perceived timing (P-center) of musical sounds.
-
Danielsen, Anne; Nymoen, Kristian; Haugen, Mari Romarheim & Câmara, Guilherme Schmidt (2017). Project presentation: Timing and Sound in Musical Microrhythm (TIME).
-
London, Justin; Nymoen, Kristian; Thompson, Marc; Code, David Loberg & Danielsen, Anne (2017). Where is the beat in that note? Comparing methods for identifying the p-center of musical sounds.
-
Nymoen, Kristian (2017). Xsens Motion Capture as Musical Instrument.
-
Nymoen, Kristian; Danielsen, Anne & London, Justin (2017). Validating Attack Phase Descriptors Obtained by the Timbre Toolbox and MIRtoolbox.
-
Watne, Åshild & Nymoen, Kristian (2017). Entrepreneurship in Higher Music Education in Norway.
-
Godøy, Rolf Inge; Song, Min-Ho; Nymoen, Kristian; Haugen, Mari Romarheim & Jensenius, Alexander Refsum (2016, 13 July). ¿Por qué marcamos el ritmo de la música con los pies? [Internet]. BBC Mundo.
Summary:
Have you ever been in a bar or a restaurant, sitting outside or in a lounge, when music starts playing and you and others begin tapping the floor with your feet to the rhythm of the music?
-
Jensenius, Alexander Refsum & Nymoen, Kristian (2016). Muskel- og bevegelsesmusikk.
-
Jensenius, Alexander Refsum & Nymoen, Kristian (2016). Velkommen til Institutt for musikkvitenskap.
-
Jensenius, Alexander Refsum; Zeiner-Henriksen, Hans T. & Nymoen, Kristian (2016). Music Moves: Why does music make you move? [www].
Summary:
Learn about the psychology of music and movement, and how researchers study music-related movements, with this free online course.
-
Haugen, Mari Romarheim & Nymoen, Kristian (2015). Evaluating Input Devices for Dance Research.
-
Jensenius, Alexander Refsum; Nymoen, Kristian; Haugen, Mari Romarheim; Andersen, Ida & Evertsson, Henrik (2015, 18 March). Ubevegelig norgesrekord. Universitas.
Summary:
On Thursday 12 March, the second-ever Norwegian championship in standstill took place.
-
Jensenius, Alexander Refsum; Nymoen, Kristian; Tveit, Anders; Haugen, Mari Romarheim; Solberg, Ragnhild Torvanger; Eikeland, Håkon Bachken & Andersen, Thomas Dahl (2015). OMO på Komdagen.
-
Nymoen, Kristian (2015). MYO-music, demo.
-
Nymoen, Kristian; Haugen, Mari Romarheim & Jensenius, Alexander Refsum (2015). MuMYO – Evaluating and Exploring the MYO Armband for Musical Interaction.
-
Nymoen, Kristian & Sigurjonsdottir, Sol (2015, 6 March). Hackere skal utvikle fremtidens musikkløsninger på Bylarm. [Internet]. Aftenposten/Osloby.
-
Nymoen, Kristian; Tørresen, Jim; Drury, Jim & Chen, Roselle (2015, 10 July). Sole to soul - foot tapping to a new beat. [Internet]. Reuters video.
-
Olaisen, Rolf Petter & Nymoen, Kristian (2015, 5 March). Konkurranse på By:Larm. [TV]. NRK Dagsrevyen 21, NRK Distriktsnyheter Østlandssendingen.
Summary:
At by:Larm 2015, Norway's very first Music Hackathon is being held – a concept staged annually at several of the world's largest music festivals.
-
Jensenius, Alexander Refsum & Nymoen, Kristian (2014). MakerCon demo: Motion capture and music.
-
Knutzen, Håkon; Glette, Kyrre Harald; Nymoen, Kristian; Chandra, Arjun & Tørresen, Jim (2014). PheroMusic.
Summary:
Listen to music and control it with your own motion and selections! PheroMusic provides five different musical scenarios called "soundscapes". In each scenario, music is generated automatically. Control the way the music is generated by tilting your iOS device. Select a new soundscape on the screen. If you don't make a selection within 20 seconds, your device makes one for you. A machine learning algorithm is used to make the device remember your choices. Press and hold on a soundscape to schedule it next without moving there immediately. To navigate between soundscapes, PheroMusic uses a nature-inspired algorithm to implement musical memory in an artificial agent. In nature, ants deposit pheromones when looking for food. The pheromone trails are used to locate the same food source in the future. In PheroMusic, an artificial agent navigates between a number of soundscapes. Users may make active choices and decide the next soundscape to which the agent moves. When a new soundscape is selected, an artificial pheromone trail is created, influencing the path taken by the user in the future. The research leading to the development of this app has received funding from the European Union Seventh Framework Programme under grant agreement no. 257906.
-
Nymoen, Kristian (2014). Musikkteknologi som beveger: Fremtidens musikkinstrumenter.
Summary:
Movement has always been part of music – from the musician's sound-producing movements to the listener's dance. With modern motion-sensing technology, the body movements of, for example, dancers can be captured and processed by computers. Can we design an interactive music system in which dance creates music? What does it take for the body movements to really fit the music that comes out? And how are sensors actually used to capture body movements? Kristian shows how an advanced motion tracking suit (mocap suit) can be used to create entirely new musical experiences.
-
Nymoen, Kristian; Chandra, Arjun; Glette, Kyrre Harald & Tørresen, Jim (2014). Decentralized harmonic synchronization in mobile music systems.
-
Nymoen, Kristian; Chandra, Arjun; Glette, Kyrre Harald; Tørresen, Jim; Voldsund, Arve & Jensenius, Alexander Refsum (2014). PheroMusic: Navigating a Musical Space for Active Music Experiences.
-
Nymoen, Kristian & Knoppers, Rijkert (2014, 19 November). Smartphone past muziek aan aan gemoed. NRC Handelsblad.
-
Nymoen, Kristian; Song, Sichao; Hafting, Yngve & Tørresen, Jim (2014). Funky Sole Music: Gait Recognition and Adaptive Mapping.
-
Nymoen, Kristian & Vogt, Yngve (2014, 14 November). New mobile tech makes music to suit moods. [Internet]. The Local.
-
Tørresen, Jim & Nymoen, Kristian (2014). Kunstig klok. Dagens næringsliv, (139), 35. ISSN 0803-9372.
-
Tørresen, Jim; Nymoen, Kristian & Vogt, Yngve (2014, 5 November). Musikken blir aldri den samme. [Internet]. forskning.no.
-
Tørresen, Jim; Nymoen, Kristian & Vogt, Yngve (2014, 10 November). Software uses smartphone sensors to manipulate music. [Internet]. Phys.org.
-
Tørresen, Jim; Nymoen, Kristian & Vogt, Yngve (2014, 13 November). Tecnologia – Software em desenvolvimento na Noruega promete revolucionar forma de ouvir música. [Internet]. Portal Bragança.
-
Tørresen, Jim; Nymoen, Kristian & Vogt, Yngve (2014, 22 November). Una app capta tu estado de ánimo y le asocia una música. [Internet]. modetvideo.com.uy.
-
Vogt, Yngve; Abajo, Carlos Gómez; Tørresen, Jim & Nymoen, Kristian (2014, 20 November). Una app capta tu estado de ánimo y le pone la música adecuada. [Internet]. La Razon.
-
Vogt, Yngve; Abajo, Carlos Gómez; Tørresen, Jim & Nymoen, Kristian (2014, 17 November). Una 'app' capta tu estado de ánimo y le pone la música adecuada, a través del smartphone. [Internet]. Tendencias21.
-
Vogt, Yngve; Tørresen, Jim & Nymoen, Kristian (2014, 10 November). Music Will Never Be the Same. [Internet]. ACM TechNews.
-
Vogt, Yngve; Tørresen, Jim & Nymoen, Kristian (2014, 14 November). Music software Reads Body Language to create Playlist based on Mood. [Internet]. Macedonian International News Agency.
-
Vogt, Yngve; Tørresen, Jim & Nymoen, Kristian (2014, 7 November). Musikken blir aldri den samme. [Magazine]. Apollon.
-
Vogt, Yngve; Tørresen, Jim & Nymoen, Kristian (2014, 13 November). Novo software promete revolucionar forma de ouvir música. [Internet]. sapo.pt.
-
Vogt, Yngve; Tørresen, Jim & Nymoen, Kristian (2014, 24 November). Você e seu celular vão mudar o ritmo da música. [Internet]. Inovação Tecnológica.
-
Nymoen, Kristian; Chandra, Arjun & Tørresen, Jim (2013). Firefly with me: distributed synchronization of musical agents. Awareness Magazine. doi: 10.2417/3201311.005187
Summary:
A system where individual musical nodes are aware of others and adapt to reach a synchronous state provides a collaborative active music performance for smartphones.
-
Nymoen, Kristian; Chandra, Arjun & Tørresen, Jim (2013). The Challenge of Decentralised Synchronisation in Interactive Music Systems.
-
Nymoen, Kristian; Chandra, Arjun; Ånonsen, Simon; Anthonsen, Ingeborg Olavsrud & Andresen, Ola Haukland (2013). Oslo mobile orchestra at Læring for framtiden.
-
Nymoen, Kristian; Tørresen, Jim; Godøy, Rolf Inge; Jensenius, Alexander Refsum & Høvin, Mats Erling (2013). Methods and Technologies for Analysing Links Between Musical Sound and Body Motion. Series of dissertations submitted to the Faculty of Mathematics and Natural Sciences, University of Oslo, no. 1291.
Summary:
There are strong indications that musical sound and body motion are related. For instance, musical sound is often the result of body motion in the form of sound-producing actions, and musical sound may lead to body motion such as dance. The research presented in this dissertation is focused on technologies and methods of studying lower-level features of motion, and how people relate motion to sound. Two experiments on so-called sound-tracing, meaning representation of perceptual sound features through body motion, have been carried out and analysed quantitatively. The motion of a number of participants has been recorded using state-of-the-art motion capture technologies. In order to determine the quality of the data that has been recorded, these technologies themselves are also a subject of research in this thesis. A toolbox for storing and streaming music-related data is presented. This toolbox allows synchronised recording of motion capture data from several systems, independently of system-specific characteristics like data types or sampling rates. The thesis presents evaluations of four motion tracking systems used in research on music-related body motion. They include the Xsens motion capture suit, optical infrared marker-based systems from NaturalPoint and Qualisys, as well as the inertial sensors of an iPod Touch. These systems cover a range of motion tracking technologies, from state-of-the-art to low-cost and ubiquitous mobile devices. Weaknesses and strengths of the various systems are pointed out, with a focus on applications for music performance and analysis of music-related motion. The process of extracting features from motion data is discussed in the thesis, along with motion features used in analysis of sound-tracing experiments, including time-varying features and global features. Features for realtime use are also discussed related to the development of a new motion-based musical instrument: The SoundSaber. Finally, four papers on sound-tracing experiments present results and methods of analysing people’s bodily responses to short sound objects. These papers cover two experiments, presenting various analytical approaches. In the first experiment participants moved a rod in the air to mimic the sound qualities in the motion of the rod. In the second experiment the participants held two handles and a different selection of sound stimuli was used. In both experiments optical infrared marker-based motion capture technology was used to record the motion. The links between sound and motion were analysed using four approaches. (1) A pattern recognition classifier was trained to classify sound-tracings, and the performance of the classifier was analysed to search for similarity in motion patterns exhibited by participants. (2) Spearman’s ρ correlation was applied to analyse the correlation between individual sound and motion features. (3) Canonical correlation analysis was applied in order to analyse correlations between combinations of sound features and motion features in the sound-tracing experiments. (4) Traditional statistical tests were applied to compare sound-tracing strategies between a variety of sounds and participants differing in levels of musical training. Since the individual analysis methods provide different perspectives on the links between sound and motion, the use of several methods of analysis is recommended to obtain a broad understanding of how sound may evoke bodily responses.
-
Skogstad, Ståle Andreas van Dorp; Nymoen, Kristian; Høvin, Mats Erling; Holm, Sverre & Jensenius, Alexander Refsum (2013). Filtering Motion Capture Data for Real-Time Applications.
Show summary
In this paper we present some custom-designed filters for real-time motion capture applications. Our target application is motion controllers, i.e. systems that interpret hand motion for musical interaction. In earlier research we found effective methods for designing nearly optimal filters for real-time applications. However, to design suitable filters for our target application, it is necessary to establish the typical frequency content of the motion capture data we want to filter. This in turn allows us to determine a reasonable cutoff frequency for the filters. We have therefore conducted an experiment in which we recorded the hand motion of 20 subjects. The frequency spectra of these data, together with a method similar to residual analysis, were used to determine reasonable cutoff frequencies. Based on this experiment, we propose three cutoff frequencies for different scenarios and filtering needs: 5, 10 and 15 Hz, corresponding to heavy, medium and light filtering, respectively. Finally, we propose a range of real-time filters applicable to motion controllers, in particular low-pass filters and low-pass differentiators of degrees one and two, which in our experience are the most useful filters for our target application.
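As a minimal sketch of how the proposed cutoff frequencies might be used in a streaming setting: the paper's custom filter designs are not reproduced here, so a standard Butterworth low-pass stands in for them, and the 100 Hz sampling rate is an assumption.

```python
# Sketch: sample-by-sample low-pass filtering of a motion capture stream at
# one of the proposed cutoffs (5/10/15 Hz = heavy/medium/light filtering).
import numpy as np
from scipy.signal import butter, lfilter, lfilter_zi

fs = 100.0       # assumed motion capture sampling rate (Hz)
cutoff = 10.0    # "medium" filtering; use 5.0 for heavy, 15.0 for light
b, a = butter(4, cutoff / (fs / 2))   # 4th-order low-pass, normalized cutoff

zi = lfilter_zi(b, a) * 0.0           # zeroed filter state for streaming use
for sample in np.random.randn(50):    # stand-in for incoming mocap samples
    y, zi = lfilter(b, a, [sample], zi=zi)
    # y[0] is the filtered sample, available with only the filter's delay
```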
-
Chandra, Arjun; Nymoen, Kristian; Voldsund, Arve; Jensenius, Alexander Refsum; Glette, Kyrre Harald & Tørresen, Jim (2012). Enabling Participants to Play Rhythmic Solos Within a Group via Auctions.
Show summary
The paper presents the interactive music system SoloJam, which allows a group of participants with little or no musical training to effectively play together in a "band-like" setting. It allows the participants to take turns playing solos made up of rhythmic pattern sequences. We identify the key requirement for enabling such participation as the decentralised, coherent circulation of solos, to be realised by some form of intelligence within the devices used for participation. Taking inspiration from the economic sciences, we propose that this intelligence take the form of devices capable of evaluating their utility of playing the next solo, of holding auctions, and of bidding within them. We show that holding and bidding in auctions decentralises the coordination of solo circulation, while a properly designed utility function ensures coherence in the musical output. The approach achieves decentralised coherent circulation with artificial agents simulating human participants, and its effectiveness is further supported when human users participate. As a result, the approach is shown to be effective at enabling participants with little or no musical training to play together in SoloJam.
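The auction idea can be illustrated with a short sketch. The utility function below is a made-up placeholder rather than the one designed in the paper, and the auction round is run in one place for brevity, whereas the actual system coordinates it in a decentralised fashion across the devices.

```python
# Hedged sketch of auction-based solo circulation: each device bids its
# utility for playing the next solo; the highest bidder takes over.
import random

class Device:
    def __init__(self, name):
        self.name = name
        self.solos_played = 0

    def utility(self):
        # Placeholder utility: devices that have played fewer solos
        # tend to want the next solo more.
        return random.random() / (1 + self.solos_played)

def auction_round(devices):
    bids = {d: d.utility() for d in devices}
    winner = max(bids, key=bids.get)   # highest bid wins the next solo
    winner.solos_played += 1
    return winner

devices = [Device(f"device{i}") for i in range(4)]
for _ in range(6):
    print(auction_round(devices).name)
```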
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum; Voldsund, Arve; Glette, Kyrre Harald; Høvin, Mats Erling; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Tørresen, Jim (2012). Classifying Music-Related Actions.
Show summary
Our research on music-related actions is based on the conviction that sensations of both sound and body motion are inseparable in the production and perception of music. The expression "music-related actions" is here used to refer to chunks of combined sound and body motion, typically in the duration range of approximately 0.5 to 5 seconds. We believe that chunk-level music-related actions are highly significant for the experience of music, and we are presently working on establishing a database of music-related actions in order to facilitate access to, and research on, our fast-growing collection of motion capture data and related material. In this work, we are confronted with a number of perceptual, conceptual and technological issues regarding the classification of music-related actions, issues that will be presented and discussed in this paper.
-
Jensenius, Alexander Refsum; Knutzen, Håkon; Nymoen, Kristian; Tveit, Anders; Voldsund, Arve; Madsen, Tommy; Støver, Catherine & Bekkedal, Even (2012). Oslo iPhone Ensemble at the opening of Realfagsbiblioteket.
-
Jensenius, Alexander Refsum; Nymoen, Kristian; Chandra, Arjun; Halmrast, Tor; Rødseth, Alexander; Johansen, Lisa Schøne; Caspersen, Martin; Røsten, Hauk J. & Antonsen, Håvard (2012). OMO performance: Squeeky.
-
Jensenius, Alexander Refsum; Nymoen, Kristian; Havnes, Heljar & Vold, Skjalg Bøhmer (2012, March 14). Can't stand still.
Universitas.
Show summary
Did you think you had full control over your body? You don't. At UiO, researchers are now studying the body's uncontrolled movements.
-
Jensenius, Alexander Refsum; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Voldsund, Arve (2012). A Study of the Noise-Level in Two Infrared Marker-Based Motion Capture Systems.
Show summary
With musical applications in mind, this paper reports on the level of noise observed in two commercial infrared marker-based motion capture systems: one high-end (Qualisys) and one affordable (OptiTrack). We have tested how various factors (calibration volume, marker size, sampling frequency, etc.) influence the noise level of markers lying still and of markers fixed to subjects standing still. The conclusion is that the motion observed in humans standing still is usually considerably greater than the noise level of the systems. Depending on the system and its calibration, however, the signal-to-noise ratio may in some cases be problematic.
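The core measurement can be sketched as follows. The synthetic input arrays and the use of RMS deviation from the mean position as the noise measure are assumptions for this example, not the paper's exact procedure.

```python
# Sketch: noise level of a marker lying still vs. the motion of a person
# standing still, measured as RMS deviation from the mean position.
import numpy as np

def noise_level(positions):
    """RMS deviation from the mean position for an (n, 3) array of x, y, z."""
    centered = positions - positions.mean(axis=0)
    return np.sqrt((centered ** 2).sum(axis=1).mean())

still_marker = np.random.normal(0, 0.05, (1000, 3))    # mm-scale system noise
standing_person = np.random.normal(0, 2.0, (1000, 3))  # sway while "still"

n_marker = noise_level(still_marker)
n_person = noise_level(standing_person)
print(f"system noise: {n_marker:.3f} mm, human 'stillness': {n_person:.3f} mm")
print(f"ratio: {n_person / n_marker:.1f}x")
```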
-
Jensenius, Alexander Refsum; Nymoen, Kristian; Voldsund, Arve; Straume, Christopher; Rivocantus, Dositheos; Sandvik, Bjørnar; Frogner, Alexander; Fremmerlid, Lars Gärtner & Røsten, Hauk J. (2012). OMO plays HTC.
-
Nymoen, Kristian; Voldsund, Arve; Skogstad, Ståle Andreas van Dorp; Jensenius, Alexander Refsum & Tørresen, Jim (2012). Comparing Motion Data from an iPod Touch to a High-End Optical Infrared Marker-Based Motion Capture System.
Show summary
The paper presents an analysis of the quality of motion data from an iPod Touch (4th gen.). Acceleration and orientation data derived from the internal sensors of the iPod are compared to data from a high-end optical infrared marker-based motion capture system (Qualisys) in terms of latency, jitter, accuracy and precision. We identify some rotational drift in the iPod, as well as some time lag between the two systems. Still, the iPod motion data is quite reliable, especially for describing relative motion over a short period of time.
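One step in such a comparison, estimating the time lag between the two systems, can be sketched with cross-correlation. The signals, sampling rate, and lag below are synthetic stand-ins, not the paper's data or method.

```python
# Sketch: estimating the lag between an iPod accelerometer signal and a
# Qualisys-derived reference via cross-correlation.
import numpy as np
from scipy.signal import chirp, correlate, correlation_lags

fs = 100.0                                   # assumed common sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
qualisys = chirp(t, f0=0.5, t1=10, f1=3.0)   # reference acceleration signal
ipod = np.roll(qualisys, 8) + 0.1 * np.random.randn(t.size)  # lagged + noisy

xc = correlate(ipod, qualisys, mode="full")
lags = correlation_lags(ipod.size, qualisys.size, mode="full")
lag = lags[np.argmax(xc)]
print(f"estimated lag: {lag} samples = {lag / fs * 1000:.0f} ms")
```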