Charles Martin

Postdoctoral Fellow
Mobile phone +47 920 14 165
Room 4411
Visiting address: Gaustadalléen 23 B, Ole-Johan Dahls hus, 0373 OSLO
Postal address: Postboks 1080 Blindern, 0316 OSLO

Charles Martin is a specialist in percussion, computer music, and interactive media from Canberra, Australia. He links percussion with electroacoustic music and other media through new technologies. In 2016, Charles joined the Engineering Predictability with Embodied Cognition (EPEC) project at the University of Oslo, where he is developing new ways to predict musical intentions and performances in smartphone apps.

Since 2018, Charles has also been a researcher in the RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion.

Research Interests:

  • new interfaces for musical performance
  • computational creativity
  • smartphone/tablet musical instruments
  • collaborative performance
  • co-creative interfaces
  • improvisation
  • percussive approaches to computer music

Supervision:

Charles supervises master's projects in the EPEC project and the ROBIN group at IFI. Projects are available in:

  • developing predictive musical instruments
  • machine learning of musical style
  • musical AI
  • computer support for collaborative musical expression

More information on interactive music and musical AI is available at the EPEC project page. Here are some helpful slides about doing musical master's projects.

Tags: Music Technology, HCI, Intelligent Systems, Mobile Applications, Improvisation, Collaborative Creativity, Percussion

Publications

Journal Articles

Book Chapters

  • Charles Martin and Henry Gardner. 2016. A percussion-focussed approach to preserving touch-screen improvisation. In Curating the digital: Spaces for art and interaction, David England, Thecla Schiphorst and Nick Bryan-Kinns (eds.). Springer International Publishing, Switzerland. http://doi.org/10.1007/978-3-319-28722-5_5
  • Charles P. Martin and Henry Gardner. 2018. Free-improvised rehearsal-as-research for musical HCI (in press). In Music and HCI, Tom Mudd, Simon Holland, Katie Wilkie, Andrew McPherson and Marcelo M. Wanderley (eds.). Springer.

Papers in Conference Proceedings

  • Charles P. Martin and Jim Torresen. 2018. RoboJam: A musical mixture density network for collaborative touchscreen interaction. Computational intelligence in music, sound, art and design: International conference, EvoMUSART (in press), Springer. http://arxiv.org/abs/1711.10746
  • Charles P. Martin, Kai Olav Ellefsen, and Jim Torresen. 2017. Deep models for ensemble touch-screen improvisation. Proceedings of AM ’17. http://doi.org/10.1145/3123514.3123556 (postprint download)
  • Charles P. Martin and Jim Torresen. 2017. Exploring social mobile music with tiny touch-screen performances. Proceedings of the 14th sound and music computing conference, Aalto University, 175–180. http://smc2017.aalto.fi/media/materials/proceedings/SMC17_p175.pdf
  • Charles P. Martin and Jim Torresen. 2017. MicroJam: An app for sharing tiny touch-screen performances. Proceedings of the international conference on new interfaces for musical expression, Aalborg University Copenhagen, 495–496. http://urn.nb.no/URN:NBN:no-58823
  • Charles Martin, Henry Gardner, Ben Swift, and Michael Martin. 2016. Intelligent agents and networked buttons improve free-improvised ensemble music-making on touch-screens. Proceedings of the SIGCHI conference on human factors in computing systems, ACM. http://doi.org/10.1145/2858036.2858269
  • Charles Martin, Henry Gardner, Ben Swift, and Michael Martin. 2015. Music of 18 performances: Evaluating apps and agents with free improvisation. Proceedings of the 2015 conference of the Australasian Computer Music Association, Australasian Computer Music Association, 85–94. http://hdl.handle.net/1885/95205
  • Charles Martin and Henry Gardner. 2015. That syncing feeling: Networked strategies for enabling ensemble creativity in iPad musicians. Proceedings of CreateWorld, Griffith University. http://hdl.handle.net/1885/95216
  • Charles Martin, Henry Gardner, and Ben Swift. 2015. Tracking ensemble performance on touch-screens with gesture classification and transition matrices. Proceedings of the international conference on new interfaces for musical expression. http://www.nime.org/proceedings/2015/nime2015_242.pdf
  • Charles Martin, Henry Gardner, and Ben Swift. 2014. MetaTravels and MetaLonsdale: iPad apps for percussive improvisation. CHI ’14 extended abstracts on human factors in computing systems, ACM, 547–550. http://doi.org/10.1145/2559206.2574805
  • Charles Martin. 2014. Making improvised music for iPad and percussion with Ensemble Metatone. Proceedings of the Australasian computer music conference, 115–118. http://hdl.handle.net/1885/95314
  • Charles Martin, Henry Gardner, and Ben Swift. 2014. Exploring percussive gesture on iPads with Ensemble Metatone. Proceedings of the SIGCHI conference on human factors in computing systems, ACM, 1025–1028. http://doi.org/10.1145/2556288.2557226
  • Charles Martin. 2013. Integrating mobile music with percussion performance practice. Proceedings of the international computer music conference, 437–440. http://hdl.handle.net/2027/spo.bbp2372.2013.073
  • Charles Martin. 2013. Performing with a mobile computer system for vibraphone. Proceedings of the international conference on new interfaces for musical expression, Graduate School of Culture Technology, KAIST, 377–380. http://nime.org/proceedings/2013/nime2013_121.pdf
  • Charles Martin. 2012. Creating mobile computer music for percussionists: Snow music. Proceedings of the Australasian computer music conference, Australasian Computer Music Association. http://doi.org/10.13140/RG.2.1.5150.5687
  • Charles Martin and Chi-Hsia Lai. 2011. Strike on Stage: A percussion and media performance. Proceedings of the international conference on new interfaces for musical expression, 142–143. http://www.nime.org/proceedings/2011/nime2011_142.pdf
  • Charles Martin, Benjamin Forster, and Hanna Cormick. 2010. Audience interactive performance in "The Last Man to Die". Proceedings of the Australasian computer music conference, Australasian Computer Music Association, 89–91. http://hdl.handle.net/1885/101945
  • Charles Martin, Benjamin Forster, and Hanna Cormick. 2010. Cross-artform performance using networked interfaces: Last Man to Die’s Vital LMTD. Proceedings of the international conference on new interfaces for musical expression, 204–207. http://www.nime.org/proceedings/2010/nime2010_204.pdf

Publications in Cristin

  • Wallace, Benedikte; Hilton, C.; Nymoen, Kristian; Tørresen, Jim; Martin, Charles & Fiebrink, Rebecca (2023). Embodying an Interactive AI for Dance Through Movement Ideation, C&C '23: Creativity and Cognition Virtual Event USA June 19 - 21, 2023. Association for Computing Machinery (ACM). ISSN 979-8-4007-0180-1. p. 454–464. doi: https://doi.org/10.1145/3591196.3593336.
  • Wallace, Benedikte; Martin, Charles Patrick; Tørresen, Jim & Nymoen, Kristian (2021). Exploring the Effect of Sampling Strategy on Movement Generation with Generative Neural Networks, EvoMUSART 2021: Artificial Intelligence in Music, Sound, Art and Design. Springer Nature. ISSN 978-3-030-72913-4. p. 344–359. doi: 10.1007/978-3-030-72914-1_23.
  • Wallace, Benedikte; Martin, Charles Patrick; Tørresen, Jim & Nymoen, Kristian (2021). Learning Embodied Sound-Motion Mappings: Evaluating AI-Generated Dance Improvisation, C&C'21: Proceedings of the 13th Conference on Creativity and Cognition. Association for Computing Machinery (ACM). ISSN 9781450383769. doi: 10.1145/3450741.3465245.
  • Nygaard, Tønnes; Martin, Charles Patrick; Howard, David; Tørresen, Jim & Glette, Kyrre (2021). Environmental Adaptation of Robot Morphology and Control Through Real-world Evolution. Evolutionary Computation. ISSN 1063-6560. doi: 10.1162/evco_a_00291. Full text in Research Archive
  • Nygaard, Tønnes; Martin, Charles Patrick; Tørresen, Jim; Glette, Kyrre & Howard, David (2021). Real-world embodied AI through a morphologically adaptive quadruped robot. Nature Machine Intelligence. doi: 10.1038/s42256-021-00320-3. Full text in Research Archive
  • Martin, Charles Patrick & Tørresen, Jim (2020). Data-Driven Analysis of Tiny Touchscreen Performance with MicroJam. Computer Music Journal. ISSN 0148-9267. 43(4), p. 41–57. doi: 10.1162/comj_a_00536.
  • Martin, Charles Patrick; Glette, Kyrre; Nygaard, Tønnes & Tørresen, Jim (2020). Understanding Musical Predictions With an Embodied Interface for Musical Machine Learning. Frontiers in Artificial Intelligence. ISSN 2624-8212. 3(6). doi: 10.3389/frai.2020.00006.
  • Erdem, Cagri; Lan, Qichao; Fuhrer, Julian; Martin, Charles Patrick; Tørresen, Jim & Jensenius, Alexander Refsum (2020). Towards Playing in the 'Air': Modeling Motion-Sound Energy Relationships in Electric Guitar Performance Using Deep Neural Networks. In Spagnol, Simone & Valle, Andrea (Ed.), Proceedings of the 17th Sound and Music Computing Conference. Axea sas/SMC Network. ISSN 978-88-945415-0-2. p. 177–184. Full text in Research Archive
  • Martin, Charles Patrick & Tørresen, Jim (2019). Data Driven Analysis of Tiny Touchscreen Performance with MicroJam. Computer Music Journal. ISSN 0148-9267. 43(4). Full text in Research Archive
  • Wallace, Benedikte; Martin, Charles Patrick & Nymoen, Kristian (2019). Tracing from Sound to Movement with Mixture Density Recurrent Neural Networks. In Coleman, Grisha (Eds.), Proceedings of the 6th International Conference on Movement and Computing. ACM Publications. ISSN 978-1-4503-7654-9. doi: 10.1145/3347122.3371376.
  • Nygaard, Tønnes Frostad; Nordmoen, Jørgen Halvorsen; Ellefsen, Kai Olav; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Experiences from Real-World Evolution with DyRET: Dynamic Robot for Embodied Testing, Nordic Artificial Intelligence Research and Development: Third Symposium of the Norwegian AI Society, NAIS 2019. Springer. ISSN 978-3-030-35664-4. p. 58–68. doi: https://doi.org/10.1007/978-3-030-35664-4_6.
  • Weber, Aline; Martin, Charles Patrick; Tørresen, Jim & da Silva, Bruno Castro (2019). Identifying Reusable Early-Life Options, Proceedings of the 9th Joint IEEE International Conference on Development and Learning and on Epigenetic Robotics. IEEE (Institute of Electrical and Electronics Engineers). ISSN 978-1-5386-8129-9. doi: 10.1109/DEVLRN.2019.8850725.
  • Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Self-Modifying Morphology Experiments with DyRET: Dynamic Robot for Embodied Testing. IEEE International Conference on Robotics and Automation (ICRA). ISSN 1050-4729. 2019-May, p. 9446–9452. doi: 10.1109/ICRA.2019.8793663. Full text in Research Archive
  • Faitas, Andrei; Baumann, Synne Engdahl; Næss, Torgrim Rudland; Tørresen, Jim & Martin, Charles Patrick (2019). Generating Convincing Harmony Parts with Simple Long Short-Term Memory Networks. In Queiroz, Marcelo & Xambo Sedo, Anna (Ed.), Proceedings of the International Conference on New Interfaces for Musical Expression. Universidade Federal do Rio Grande do Sul. ISSN 2220-4792. Full text in Research Archive
  • Martin, Charles Patrick & Tørresen, Jim (2019). An Interactive Musical Prediction System with Mixture Density Recurrent Neural Networks. In Queiroz, Marcelo & Xambo Sedo, Anna (Ed.), Proceedings of the International Conference on New Interfaces for Musical Expression. Universidade Federal do Rio Grande do Sul. ISSN 2220-4792. p. 260–265. Full text in Research Archive
  • Næss, Torgrim Rudland & Martin, Charles Patrick (2019). A Physical Intelligent Instrument using Recurrent Neural Networks. In Queiroz, Marcelo & Xambo Sedo, Anna (Ed.), Proceedings of the International Conference on New Interfaces for Musical Expression. Universidade Federal do Rio Grande do Sul. ISSN 2220-4792. p. 79–82. Full text in Research Archive
  • Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Evolving Robots on Easy Mode: Towards a Variable Complexity Controller for Quadrupeds. Lecture Notes in Computer Science (LNCS). ISSN 0302-9743. 11454 LNCS, p. 616–632. doi: 10.1007/978-3-030-16692-2_41. Full text in Research Archive
  • Wallace, Benedikte & Martin, Charles Patrick (2019). Comparing models for harmony prediction in an interactive audio looper. Lecture Notes in Computer Science (LNCS). ISSN 0302-9743. 11453 LNCS, p. 173–187. doi: 10.1007/978-3-030-16667-0_12.
  • Martin, Charles Patrick & Gardner, Henry (2019). Free-Improvised Rehearsal-as-Research for Musical HCI. In Holland, Simon; Mudd, Tom; Wilkie-McKenna, Katie; McPherson, Andrew & Wanderley, Marcelo M. (Ed.), New Directions in Music and Human-Computer Interaction. Springer. ISSN 978-3-319-92068-9. p. 269–284. doi: https://doi.org/10.1007/978-3-319-92069-6_17. Full text in Research Archive
  • Nygaard, Tønnes Frostad; Martin, Charles Patrick; Samuelsen, Eivind; Tørresen, Jim & Glette, Kyrre (2018). Real-world evolution adapts robot morphology and control to hardware limitations. In Aguirre, Hernan (Eds.), GECCO '18: Proceedings of the Genetic and Evolutionary Computation Conference. Association for Computing Machinery (ACM). ISSN 978-1-4503-5618-3. p. 125–132. doi: 10.1145/3205455.3205567. Full text in Research Archive
  • Martin, Charles Patrick & Tørresen, Jim (2018). RoboJam: A musical mixture density network for collaborative touchscreen interaction. Lecture Notes in Computer Science (LNCS). ISSN 0302-9743. 10783 LNCS, p. 161–176. doi: 10.1007/978-3-319-77583-8_11. Full text in Research Archive
  • Martin, Charles Patrick; Jensenius, Alexander Refsum & Tørresen, Jim (2018). Composing an ensemble standstill work for Myo and Bela. In Dahl, Luke; Bowman, Doug & Martin, Tom (Ed.), Proceedings of the International Conference On New Interfaces For Musical Expression. Virginia Tech. ISSN 2220-4792. p. 196–197. Full text in Research Archive
  • Gonzalez Sanchez, Victor Evaristo; Martin, Charles Patrick; Zelechowska, Agata; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria Kristine Å & Jensenius, Alexander Refsum (2018). Bela-based augmented acoustic guitars for sonic microinteraction. In Dahl, Luke; Bowman, Doug & Martin, Tom (Ed.), Proceedings of the International Conference On New Interfaces For Musical Expression. Virginia Tech. ISSN 2220-4792. p. 324–327. Full text in Research Archive
  • Martin, Charles Patrick (2017). Percussionist-Centred Design for Touchscreen Digital Musical Instruments. Contemporary Music Review. ISSN 0749-4467. 36(1-2), p. 64–85. doi: 10.1080/07494467.2017.1370794. Full text in Research Archive
  • Martin, Charles Patrick; Ellefsen, Kai Olav & Tørresen, Jim (2017). Deep Models for Ensemble Touch-Screen Improvisation. In Fazekas, George; Barthet, Mathieu & Stockman, Tony (Ed.), Proceedings of the 12th International Audio Mostly Conference: Augmented and Participatory Sound and Music Experiences. Association for Computing Machinery (ACM). ISSN 978-1-4503-5373-1. doi: 10.1145/3123514.3123556. Full text in Research Archive
  • Martin, Charles Patrick & Tørresen, Jim (2017). Exploring Social Mobile Music with Tiny Touch-Screen Performances. In Lokki, Tapio; Pätynen, Jukka & Välimäki, Vesa (Ed.), Proceedings of the 14th Sound and Music Computing Conference 2017. Aalto University. ISSN 978-952-60-3729-5. p. 175–180. Full text in Research Archive
  • Martin, Charles Patrick & Tørresen, Jim (2017). MicroJam: An App for Sharing Tiny Touch-Screen Performances. In Erkut, Cumhur (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression. Aalborg University Copenhagen. ISSN 2220-4792. p. 495–496. Full text in Research Archive

View all works in Cristin

  • Wallace, Benedikte; Nymoen, Kristian; Martin, Charles Patrick & Tørresen, Jim (2020). Towards Movement Generation with Audio Features.
  • Wallace, Benedikte; Nymoen, Kristian & Martin, Charles Patrick (2019). Tracing from Sound to Movement with Mixture Density Recurrent Neural Networks.
  • Næss, Torgrim Rudland; Tørresen, Jim & Martin, Charles Patrick (2019). A Physical Intelligent Instrument using Recurrent Neural Networks.
  • Martin, Charles Patrick & Torresen, Jim (2019). An Interactive Music Prediction System with Mixture Density Recurrent Neural Networks.
  • Faitas, Andrei; Baumann, Synne Engdahl; Torresen, Jim & Martin, Charles Patrick (2019). Generating Convincing Harmony Parts with Simple Long Short-Term Memory Networks.
  • Martin, Charles Patrick; Næss, Torgrim Rudland; Faitas, Andrei & Baumann, Synne Engdahl (2019). Session on Musical Prediction and Generation with Deep Learning.
  • Martin, Charles Patrick & Tørresen, Jim (2019). An Interactive Musical Prediction System with Mixture Density Recurrent Neural Networks.
  • Martin, Charles Patrick (2019). Workshop on Making Predictive NIMEs with Neural Networks.
  • Nygaard, Tønnes Frostad; Nordmoen, Jørgen Halvorsen; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Lessons Learned from Real-World Experiments with DyRET: the Dynamic Robot for Embodied Testing.
  • Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Self-Modifying Morphology Experiments with DyRET: Dynamic Robot for Embodied Testing.
  • Nygaard, Tønnes Frostad; Nordmoen, Jørgen Halvorsen; Ellefsen, Kai Olav; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Experiences from Real-World Evolution with DyRET: Dynamic Robot for Embodied Testing.
  • Jensenius, Alexander Refsum; Martin, Charles Patrick; Erdem, Cagri; Lan, Qichao; Fuhrer, Julian Peter; Gonzalez Sanchez, Victor Evaristo et al. (2019). Self-playing Guitars.
  • Bergsland, Andreas; Sullivan, John; Erdem, Cagri & Martin, Charles Patrick (2018). Interactive music systems.
  • Martin, Charles Patrick & Tørresen, Jim (2018). Predictive Musical Interaction with MDRNNs.
  • Søyseth, Vegard Dønnem; Nygaard, Tønnes Frostad; Martin, Charles Patrick; Uddin, Md Zia & Ellefsen, Kai Olav (2018). ROBIN stand at Cutting Edge 2018.
  • Martin, Charles Patrick (2018). Creative Prediction with Neural Networks.
  • Martin, Charles Patrick; Lesteberg, Mari; Jawad, Karolina; Aandahl, Eigil; Xambó, Anna & Jensenius, Alexander Refsum (2018). Stillness under Tension.
  • Martin, Charles Patrick; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata; Erdem, Cagri & Jensenius, Alexander Refsum (2018). Stillness under Tension.
  • Tørresen, Jim; Garcia Ceja, Enrique Alejandro; Ellefsen, Kai Olav & Martin, Charles Patrick (2018). Equipping Systems with Forecasting Capabilities.
  • Martin, Charles Patrick (2018). Deep Predictive Models in Interactive Music.
  • Martin, Charles Patrick; Glette, Kyrre; Nygaard, Tønnes Frostad & Tørresen, Jim (2018). Self-Awareness in a Cyber-Physical Predictive Musical Interface.
  • Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2018). Exploring Mechanically Self-Reconfiguring Robots for Autonomous Design.
  • Martin, Charles Patrick (2018). MicroJam.
  • Garcia Ceja, Enrique Alejandro; Ellefsen, Kai Olav; Martin, Charles Patrick & Tørresen, Jim (2018). Prediction, Interaction, and User Behaviour.
  • Martin, Charles Patrick; Glette, Kyrre & Tørresen, Jim (2018). Creative Prediction with Neural Networks.
  • Martin, Charles Patrick (2018). Predictive Music Systems for Interactive Performance.
  • Jensenius, Alexander Refsum; Martin, Charles Patrick; Bjerkestrand, Kari Anne Vadstensvik & Johnson, Victoria (2018). Stillness under Tension.
  • Martin, Charles Patrick; Jensenius, Alexander Refsum & Tørresen, Jim (2018). Composing an ensemble standstill work for Myo and Bela.
  • Martin, Charles Patrick; Xambó, Anna; Visi, Federico; Morreale, Fabio & Jensenius, Alexander Refsum (2018). Stillness under Tension.
  • Gonzalez Sanchez, Victor Evaristo; Martin, Charles Patrick; Zelechowska, Agata; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria & Jensenius, Alexander Refsum (2018). Bela-based augmented acoustic guitars for sonic microinteraction.
  • Jack, Robert; Jensenius, Alexander Refsum; Gonzalez Sanchez, Victor Evaristo; Bjerkestrand, Kari Anne Vadstensvik; Zelechowska, Agata; Martin, Charles Patrick et al. (2018). Sverm-Resonans: interactive installation with resonating guitars and Bela. [Internet]. Bela blog.
  • Jensenius, Alexander Refsum; Martin, Charles Patrick; Bjerkestrand, Kari Anne Vadstensvik & Johnson, Victoria (2017). Sverm-Muscle.
  • Jensenius, Alexander Refsum; Kvifte, Tellef; Innervik, Kjell Tore; Martin, Charles Patrick; Brøvig-Hanssen, Ragnhild; Lossius, Trond et al. (2017). Panel: New Interfaces for Musical Expression.
  • Jensenius, Alexander Refsum; Martin, Charles Patrick; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Johnson, Victoria (2017). Sverm-Resonans.
  • Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick (2017). Sverm-resonans.
  • Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick (2017). Sverm-Resonans.
  • Martin, Charles Patrick; Gonzalez Sanchez, Victor Evaristo; Kelkar, Tejaswinee; Zelechowska, Agata; Berggren, Stig Johan; Hopgood, Christina et al. (2017). Ensemble Metatone - Improvised Touchscreen Performance.
  • Martin, Charles Patrick (2017). Musical Networks: Using Recurrent Neural Networks to Model and Complement Musical Creativity.
  • Martin, Charles Patrick (2017). Making Social Music with MicroJam.
  • Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick (2017). Sverm-Resonans.
  • Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick (2017). Sverm-Puls.
  • Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick (2017). Sverm-Resonans.
  • Martin, Charles Patrick (2017). Virtuosic Interactions / Performing with a Neural iPad Band.
  • Martin, Charles Patrick (2017). MicroJam: A Social App for Making Music.
  • Martin, Charles Patrick (2017). Pursuing a Sonigraphical Ideal at the Dawn of the NIME Epoch. In Jensenius, Alexander Refsum & Lyons, Michael J. (Ed.), A NIME Reader: Fifteen Years of New Interfaces for Musical Expression. Springer Science+Business Media B.V. ISSN 978-3-319-47213-3. p. 103–105. doi: 10.1007/978-3-319-47214-0.
  • Nygaard, Tønnes; Glette, Kyrre; Tørresen, Jim & Martin, Charles Patrick (2020). Legging It: An Evolutionary Approach to Morphological Adaptation for a Real-World Quadruped Robot. Universitetet i Oslo. ISSN 1501-7710. Full text in Research Archive
  • Næss, Torgrim Rudland; Martin, Charles Patrick & Tørresen, Jim (2019). A Physical Intelligent Instrument using Recurrent Neural Networks. Universitetet i Oslo.
  • Wallace, Benedikte & Martin, Charles Patrick (2018). Predictive songwriting with concatenative accompaniment. Universitetet i Oslo.
  • Brustad, Henrik & Martin, Charles Patrick (2018). Digital Audio Generation with Neural Networks. Universitetet i Oslo.

View all works in Cristin

Published Aug. 22, 2016 2:15 PM - Last modified Dec. 17, 2019 10:58 AM

Projects

No ongoing projects