Charles Martin

Postdoctoral Fellow
Mobile phone +47 920 14 165
Room 4411
Visiting address Gaustadalléen 23 B, Ole-Johan Dahls hus, 0373 OSLO
Postal address P.O. Box 1080 Blindern, 0316 OSLO

Charles Martin is a researcher in computer science, machine learning, and electronic music. He connects percussion with electroacoustic music and other media through new technology. In 2016, Charles joined the Engineering Predictability with Embodied Cognition (EPEC) project at the University of Oslo, where he develops new ways of predicting musical intentions and performances in smartphone applications.

Since 2018, Charles has also been a researcher at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion.

Research interests:

  • NIME (New Interfaces for Musical Expression)
  • Creative AI
  • Smartphone musical instruments
  • Creative ensemble interaction
  • Improvisation
  • Percussion in electronic music

Charles supervises master's projects in the EPEC project and the ROBIN group at IFI. Projects are available within:

  • Development of predictive musical instruments
  • Machine learning for modelling musical style (see the sketch below)
  • Musical AI

More information about interactive music and musical AI is available on the EPEC project page, including useful information if you want to do a music-related master's project.
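Several of the publications below use mixture density recurrent neural networks (MDRNNs) for this kind of musical prediction (for example RoboJam and the interactive musical prediction system). As a rough illustration of the idea only, and not of the actual implementations, here is a minimal hypothetical sketch in PyTorch: an LSTM predicts a mixture-of-Gaussians distribution over the next touchscreen event, encoded here as illustrative (x, y, time-delta) triples. All class names, dimensions, and hyperparameters are assumptions.

import torch
import torch.nn as nn
import torch.distributions as D


class TouchMDRNN(nn.Module):
    """Predicts a distribution over the next touchscreen event (x, y, dt)."""

    def __init__(self, input_dim=3, hidden_dim=64, n_mixtures=5):
        super().__init__()
        self.rnn = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.n_mixtures = n_mixtures
        self.out_dim = input_dim
        # One weight per mixture component, plus a mean and a scale per output dimension.
        self.params = nn.Linear(hidden_dim, n_mixtures * (1 + 2 * input_dim))

    def forward(self, x, state=None):
        h, state = self.rnn(x, state)  # h: (batch, seq, hidden)
        p = self.params(h)
        logits, means, log_scales = torch.split(
            p,
            [self.n_mixtures,
             self.n_mixtures * self.out_dim,
             self.n_mixtures * self.out_dim],
            dim=-1)
        means = means.reshape(*means.shape[:-1], self.n_mixtures, self.out_dim)
        scales = log_scales.reshape(*log_scales.shape[:-1], self.n_mixtures, self.out_dim).exp()
        mix = D.Categorical(logits=logits)
        comp = D.Independent(D.Normal(means, scales), 1)
        return D.MixtureSameFamily(mix, comp), state


# Training minimises the negative log-likelihood of the observed next event;
# at performance time the instrument samples from the mixture instead.
model = TouchMDRNN()
seq = torch.rand(8, 32, 3)        # toy batch of (x, y, dt) sequences
dist, _ = model(seq[:, :-1])      # distribution over each next event
loss = -dist.log_prob(seq[:, 1:]).mean()
loss.backward()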


Publications

  • Charles P. Martin and Henry Gardner. Free-Improvised Rehearsal-as-Research for Musical HCI. In Simon Holland, Tom Mudd, Katie Wilkie-McKenna, Andrew McPherson, and Marcelo M. Wanderley, editors, New Directions in Music and Human-Computer Interaction, Springer Series on Cultural Computing, pages 269--284. Springer, Cham, February 2019. [ bib | DOI | preprint | .pdf ]
  • Charles P. Martin and Jim Torresen. RoboJam: A Musical Mixture Density Network for Collaborative Touchscreen Interaction. In Antonios Liapis, Juan Jesús Romero Cardalda, and Anikó Ekárt, editors, Computational Intelligence in Music, Sound, Art and Design: International Conference, EvoMUSART, volume 10783 of Lecture Notes in Computer Science, pages 161--176, Switzerland, April 2018. Springer International Publishing. [ bib | DOI | arXiv | video | http ]
  • Charles P. Martin. Percussionist-Centred Design for Touchscreen Digital Musical Instruments. Contemporary Music Review, 36(1--2):64--85, September 2017. [ bib | DOI | preprint ]
  • Charles P. Martin, Kai Olav Ellefsen, and Jim Torresen. Deep Models for Ensemble Touch-Screen Improvisation. In Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences, AM '17, August 2017. [ bib | DOI | preprint ]
  • Charles Martin, Henry Gardner, Ben Swift, and Michael Martin. 2016. Intelligent agents and networked buttons improve free-improvised ensemble music-making on touch-screens. Proceedings of the SIGCHI conference on human factors in computing systems, ACM. http://doi.org/10.1145/2858036.2858269
  • Charles Martin and Henry Gardner. 2016. A percussion-focussed approach to preserving touch-screen improvisation. In Curating the digital: Spaces for art and interaction, David England, Thecla Schiphorst and Nick Bryan-Kinns (eds.). Springer International Publishing, Switzerland. http://doi.org/10.1007/978-3-319-28722-5_5
  • Charles Martin, Henry Gardner, and Ben Swift. 2015. Tracking ensemble performance on touch-screens with gesture classification and transition matrices. Proceedings of the international conference on new interfaces for musical expression. http://www.nime.org/proceedings/2015/nime2015_242.pdf
  • Charles Martin, Henry Gardner, and Ben Swift. 2014. Exploring percussive gesture on iPads with Ensemble Metatone. Proceedings of the SIGCHI conference on human factors in computing systems, ACM, 1025–1028. http://doi.org/10.1145/2556288.2557226
  • Charles Patrick Martin. 2016. Apps, agents, and improvisation: Ensemble interaction with touch-screen digital musical instruments. PhD Thesis. http://hdl.handle.net/1885/101786
  • Wallace, Benedikte; Hilton, C.; Nymoen, Kristian; Tørresen, Jim; Martin, Charles & Fiebrink, Rebecca (2023). Embodying an Interactive AI for Dance Through Movement Ideation, C&C '23: Creativity and Cognition Virtual Event USA June 19 - 21, 2023. Association for Computing Machinery (ACM). ISSN 979-8-4007-0180-1. pp. 454–464. doi: https://doi.org/10.1145/3591196.3593336.
  • Wallace, Benedikte; Martin, Charles Patrick; Tørresen, Jim & Nymoen, Kristian (2021). Exploring the Effect of Sampling Strategy on Movement Generation with Generative Neural Networks, EvoMUSART 2021: Artificial Intelligence in Music, Sound, Art and Design. Springer Nature. ISSN 978-3-030-72913-4. pp. 344–359. doi: 10.1007/978-3-030-72914-1_23.
  • Wallace, Benedikte; Martin, Charles Patrick; Tørresen, Jim & Nymoen, Kristian (2021). Learning Embodied Sound-Motion Mappings: Evaluating AI-Generated Dance Improvisation, C&C'21: Proceedings of the 13th Conference on Creativity and Cognition. Association for Computing Machinery (ACM). ISSN 9781450383769. doi: 10.1145/3450741.3465245.
  • Nygaard, Tønnes; Martin, Charles Patrick; Howard, David; Tørresen, Jim & Glette, Kyrre (2021). Environmental Adaptation of Robot Morphology and Control Through Real-world Evolution. Evolutionary Computation. ISSN 1063-6560. doi: 10.1162/evco_a_00291. Full text in research archive
  • Nygaard, Tønnes; Martin, Charles Patrick; Tørresen, Jim; Glette, Kyrre & Howard, David (2021). Real-world embodied AI through a morphologically adaptive quadruped robot. Nature Machine Intelligence. doi: 10.1038/s42256-021-00320-3. Full text in research archive
  • Martin, Charles Patrick & Tørresen, Jim (2020). Data-Driven Analysis of Tiny Touchscreen Performance with MicroJam. Computer Music Journal. ISSN 0148-9267. 43(4), pp. 41–57. doi: 10.1162/comj_a_00536.
  • Martin, Charles Patrick; Glette, Kyrre; Nygaard, Tønnes & Tørresen, Jim (2020). Understanding Musical Predictions With an Embodied Interface for Musical Machine Learning. Frontiers in Artificial Intelligence. ISSN 2624-8212. 3(6). doi: 10.3389/frai.2020.00006.
  • Erdem, Cagri; Lan, Qichao; Fuhrer, Julian; Martin, Charles Patrick; Tørresen, Jim & Jensenius, Alexander Refsum (2020). Towards Playing in the 'Air': Modeling Motion-Sound Energy Relationships in Electric Guitar Performance Using Deep Neural Networks. In Spagnol, Simone & Valle, Andrea (Eds.), Proceedings of the 17th Sound and Music Computing Conference. Axea sas/SMC Network. ISSN 978-88-945415-0-2. pp. 177–184. Full text in research archive
  • Martin, Charles Patrick & Tørresen, Jim (2019). Data Driven Analysis of Tiny Touchscreen Performance with MicroJam. Computer Music Journal. ISSN 0148-9267. 43(4). Full text in research archive
  • Wallace, Benedikte; Martin, Charles Patrick & Nymoen, Kristian (2019). Tracing from Sound to Movement with Mixture Density Recurrent Neural Networks. In Coleman, Grisha (Ed.), Proceedings of the 6th International Conference on Movement and Computing. ACM Publications. ISSN 978-1-4503-7654-9. doi: 10.1145/3347122.3371376.
  • Nygaard, Tønnes Frostad; Nordmoen, Jørgen Halvorsen; Ellefsen, Kai Olav; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Experiences from Real-World Evolution with DyRET: Dynamic Robot for Embodied Testing, Nordic Artificial Intelligence Research and Development: Third Symposium of the Norwegian AI Society, NAIS 2019. Springer. ISSN 978-3-030-35664-4. pp. 58–68. doi: https://doi.org/10.1007/978-3-030-35664-4_6.
  • Weber, Aline; Martin, Charles Patrick; Tørresen, Jim & da Silva, Bruno Castro (2019). Identifying Reusable Early-Life Options, Proceedings of the 9th Joint IEEE International Conference on Development and Learning and on Epigenetic Robotics. IEEE (Institute of Electrical and Electronics Engineers). ISSN 978-1-5386-8129-9. doi: 10.1109/DEVLRN.2019.8850725.
  • Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Self-Modifying Morphology Experiments with DyRET: Dynamic Robot for Embodied Testing. IEEE International Conference on Robotics and Automation (ICRA). ISSN 1050-4729. 2019-May, pp. 9446–9452. doi: 10.1109/ICRA.2019.8793663. Full text in research archive
  • Faitas, Andrei; Baumann, Synne Engdahl; Næss, Torgrim Rudland; Tørresen, Jim & Martin, Charles Patrick (2019). Generating Convincing Harmony Parts with Simple Long Short-Term Memory Networks. In Queiroz, Marcelo & Xambo Sedo, Anna (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression. Universidade Federal do Rio Grande do Sul. ISSN 2220-4792. Full text in research archive
  • Martin, Charles Patrick & Tørresen, Jim (2019). An Interactive Musical Prediction System with Mixture Density Recurrent Neural Networks. In Queiroz, Marcelo & Xambo Sedo, Anna (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression. Universidade Federal do Rio Grande do Sul. ISSN 2220-4792. pp. 260–265. Full text in research archive
  • Næss, Torgrim Rudland & Martin, Charles Patrick (2019). A Physical Intelligent Instrument using Recurrent Neural Networks. In Queiroz, Marcelo & Xambo Sedo, Anna (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression. Universidade Federal do Rio Grande do Sul. ISSN 2220-4792. pp. 79–82. Full text in research archive
  • Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Evolving Robots on Easy Mode: Towards a Variable Complexity Controller for Quadrupeds. Lecture Notes in Computer Science (LNCS). ISSN 0302-9743. 11454 LNCS, pp. 616–632. doi: 10.1007/978-3-030-16692-2_41. Full text in research archive
  • Wallace, Benedikte & Martin, Charles Patrick (2019). Comparing models for harmony prediction in an interactive audio looper. Lecture Notes in Computer Science (LNCS). ISSN 0302-9743. 11453 LNCS, pp. 173–187. doi: 10.1007/978-3-030-16667-0_12.
  • Martin, Charles Patrick & Gardner, Henry (2019). Free-Improvised Rehearsal-as-Research for Musical HCI. In Holland, Simon; Mudd, Tom; Wilkie-McKenna, Katie; McPherson, Andrew & Wanderley, Marcelo M. (Eds.), New Directions in Music and Human-Computer Interaction. Springer. ISSN 978-3-319-92068-9. pp. 269–284. doi: https://doi.org/10.1007/978-3-319-92069-6_17. Full text in research archive
  • Nygaard, Tønnes Frostad; Martin, Charles Patrick; Samuelsen, Eivind; Tørresen, Jim & Glette, Kyrre (2018). Real-world evolution adapts robot morphology and control to hardware limitations. In Aguirre, Hernan (Ed.), GECCO '18: Proceedings of the Genetic and Evolutionary Computation Conference. Association for Computing Machinery (ACM). ISSN 978-1-4503-5618-3. pp. 125–132. doi: 10.1145/3205455.3205567. Full text in research archive
  • Martin, Charles Patrick & Tørresen, Jim (2018). RoboJam: A musical mixture density network for collaborative touchscreen interaction. Lecture Notes in Computer Science (LNCS). ISSN 0302-9743. 10783 LNCS, pp. 161–176. doi: 10.1007/978-3-319-77583-8_11. Full text in research archive
  • Martin, Charles Patrick; Jensenius, Alexander Refsum & Tørresen, Jim (2018). Composing an ensemble standstill work for Myo and Bela. In Dahl, Luke; Bowman, Doug & Martin, Tom (Eds.), Proceedings of the International Conference On New Interfaces For Musical Expression. Virginia Tech. ISSN 2220-4792. pp. 196–197. Full text in research archive
  • Gonzalez Sanchez, Victor Evaristo; Martin, Charles Patrick; Zelechowska, Agata; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria Kristine Å & Jensenius, Alexander Refsum (2018). Bela-based augmented acoustic guitars for sonic microinteraction. In Dahl, Luke; Bowman, Doug & Martin, Tom (Eds.), Proceedings of the International Conference On New Interfaces For Musical Expression. Virginia Tech. ISSN 2220-4792. pp. 324–327. Full text in research archive
  • Martin, Charles Patrick (2017). Percussionist-Centred Design for Touchscreen Digital Musical Instruments. Contemporary Music Review. ISSN 0749-4467. 36(1-2), pp. 64–85. doi: 10.1080/07494467.2017.1370794. Full text in research archive
  • Martin, Charles Patrick; Ellefsen, Kai Olav & Tørresen, Jim (2017). Deep Models for Ensemble Touch-Screen Improvisation. In Fazekas, George; Barthet, Mathieu & Stockman, Tony (Eds.), Proceedings of the 12th International Audio Mostly Conference: Augmented and Participatory Sound and Music Experiences. Association for Computing Machinery (ACM). ISSN 978-1-4503-5373-1. doi: 10.1145/3123514.3123556. Full text in research archive
  • Martin, Charles Patrick & Tørresen, Jim (2017). Exploring Social Mobile Music with Tiny Touch-Screen Performances. In Lokki, Tapio; Pätynen, Jukka & Välimäki, Vesa (Eds.), Proceedings of the 14th Sound and Music Computing Conference 2017. Aalto University. ISSN 978-952-60-3729-5. pp. 175–180. Full text in research archive
  • Martin, Charles Patrick & Tørresen, Jim (2017). MicroJam: An App for Sharing Tiny Touch-Screen Performances. In Erkut, Cumhur (Ed.), Proceedings of the International Conference on New Interfaces for Musical Expression. Aalborg University Copenhagen. ISSN 2220-4792. pp. 495–496. Full text in research archive

See all works in Cristin

  • Wallace, Benedikte; Nymoen, Kristian; Martin, Charles Patrick & Tørresen, Jim (2020). Towards Movement Generation with Audio Features.
  • Wallace, Benedikte; Nymoen, Kristian & Martin, Charles Patrick (2019). Tracing from Sound to Movement with Mixture Density Recurrent Neural Networks.
  • Næss, Torgrim Rudland; Tørresen, Jim & Martin, Charles Patrick (2019). A Physical Intelligent Instrument using Recurrent Neural Networks.
  • Martin, Charles Patrick & Torresen, Jim (2019). An Interactive Music Prediction System with Mixture Density Recurrent Neural Networks.
  • Faitas, Andrei; Baumann, Synne Engdahl; Torresen, Jim & Martin, Charles Patrick (2019). Generating Convincing Harmony Parts with Simple Long Short-Term Memory Networks.
  • Martin, Charles Patrick; Næss, Torgrim Rudland; Faitas, Andrei & Baumann, Synne Engdahl (2019). Session on Musical Prediction and Generation with Deep Learning.
  • Martin, Charles Patrick & Tørresen, Jim (2019). An Interactive Musical Prediction System with Mixture Density Recurrent Neural Networks.
  • Martin, Charles Patrick (2019). Workshop on Making Predictive NIMEs with Neural Networks.
  • Nygaard, Tønnes Frostad; Nordmoen, Jørgen Halvorsen; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Lessons Learned from Real-World Experiments with DyRET: the Dynamic Robot for Embodied Testing.
  • Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Self-Modifying Morphology Experiments with DyRET: Dynamic Robot for Embodied Testing.
  • Nygaard, Tønnes Frostad; Nordmoen, Jørgen Halvorsen; Ellefsen, Kai Olav; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2019). Experiences from Real-World Evolution with DyRET: Dynamic Robot for Embodied Testing.
  • Jensenius, Alexander Refsum; Martin, Charles Patrick; Erdem, Cagri; Lan, Qichao; Fuhrer, Julian Peter & Gonzalez Sanchez, Victor Evaristo [Show all 8 authors of this article] (2019). Self-playing Guitars.
  • Bergsland, Andreas; Sullivan, John; Cagri, Erdem & Martin, Charles Patrick (2018). Interactive music systems.
  • Martin, Charles Patrick & Tørresen, Jim (2018). Predictive Musical Interaction with MDRNNs.
  • Søyseth, Vegard Dønnem; Nygaard, Tønnes Frostad; Martin, Charles Patrick; Uddin, Md Zia & Ellefsen, Kai Olav (2018). ROBIN stand at Cutting Edge 2018.
  • Martin, Charles Patrick (2018). Creative Prediction with Neural Networks.
  • Martin, Charles Patrick; Lesteberg, Mari; Jawad, Karolina; Aandahl, Eigil; Xambó, Anna & Jensenius, Alexander Refsum (2018). Stillness under Tension.
  • Martin, Charles Patrick; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata; Erdem, Cagri & Jensenius, Alexander Refsum (2018). Stillness under Tension.
  • Tørresen, Jim; Garcia Ceja, Enrique Alejandro; Ellefsen, Kai Olav & Martin, Charles Patrick (2018). Equipping Systems with Forecasting Capabilities.
  • Martin, Charles Patrick (2018). Deep Predictive Models in Interactive Music.
  • Martin, Charles Patrick; Glette, Kyrre; Nygaard, Tønnes Frostad & Tørresen, Jim (2018). Self-Awareness in a Cyber-Physical Predictive Musical Interface.
  • Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre (2018). Exploring Mechanically Self-Reconfiguring Robots for Autonomous Design.
  • Martin, Charles Patrick (2018). MicroJam.
  • Garcia Ceja, Enrique Alejandro; Ellefsen, Kai Olav; Martin, Charles Patrick & Tørresen, Jim (2018). Prediction, Interaction, and User Behaviour.
  • Martin, Charles Patrick; Glette, Kyrre & Tørresen, Jim (2018). Creative Prediction with Neural Networks.
  • Martin, Charles Patrick (2018). Predictive Music Systems for Interactive Performance.
  • Jensenius, Alexander Refsum; Martin, Charles Patrick; Bjerkestrand, Kari Anne Vadstensvik & Johnson, Victoria (2018). Stillness under Tension.
  • Martin, Charles Patrick; Jensenius, Alexander Refsum & Tørresen, Jim (2018). Composing an ensemble standstill work for Myo and Bela.
  • Martin, Charles Patrick; Xambó, Anna; Visi, Federico; Morreale, Fabio & Jensenius, Alexander Refsum (2018). Stillness under Tension.
  • Gonzalez Sanchez, Victor Evaristo; Martin, Charles Patrick; Zelechowska, Agata; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria & Jensenius, Alexander Refsum (2018). Bela-based augmented acoustic guitars for sonic microinteraction.
  • Jack, Robert; Jensenius, Alexander Refsum; Gonzalez Sanchez, Victor Evaristo; Bjerkestrand, Kari Anne Vadstensvik; Zelechowska, Agata & Martin, Charles Patrick [Show all 7 authors of this article] (2018). Sverm-Resonans: interactive installation with resonating guitars and Bela. [Internet]. Bela blog.
  • Jensenius, Alexander Refsum; Martin, Charles Patrick; Bjerkestrand, Kari Anne Vadstensvik & Johnson, Victoria (2017). Sverm-Muscle.
  • Jensenius, Alexander Refsum; Kvifte, Tellef; Innervik, Kjell Tore; Martin, Charles Patrick; Brøvig-Hanssen, Ragnhild & Lossius, Trond [Show all 7 authors of this article] (2017). Panel: New Interfaces for Musical Expression.
  • Jensenius, Alexander Refsum; Martin, Charles Patrick; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Johnson, Victoria (2017). Sverm-Resonans.
  • Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick (2017). Sverm-resonans.
  • Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick (2017). Sverm-Resonans.
  • Martin, Charles Patrick; Gonzalez Sanchez, Victor Evaristo; Kelkar, Tejaswinee; Zelechowska, Agata; Berggren, Stig Johan & Hopgood, Christina [Show all 13 authors of this article] (2017). Ensemble Metatone - Improvised Touchscreen Performance.
  • Martin, Charles Patrick (2017). Musical Networks: Using Recurrent Neural Networks to Model and Complement Musical Creativity.
  • Martin, Charles Patrick (2017). Making Social Music with MicroJam.
  • Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick (2017). Sverm-Resonans.
  • Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick (2017). Sverm-Puls.
  • Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick (2017). Sverm-Resonans.
  • Martin, Charles Patrick (2017). Virtuosic Interactions / Performing with a Neural iPad Band.
  • Martin, Charles Patrick (2017). MicroJam: A Social App for Making Music.
  • Martin, Charles Patrick (2017). Pursuing a Sonigraphical Ideal at the Dawn of the NIME Epoch. In Jensenius, Alexander Refsum & Lyons, Michael J. (Eds.), A NIME Reader: Fifteen Years of New Interfaces for Musical Expression. Springer Science+Business Media B.V. ISSN 978-3-319-47213-3. pp. 103–105. doi: 10.1007/978-3-319-47214-0.
  • Nygaard, Tønnes; Glette, Kyrre; Tørresen, Jim & Martin, Charles Patrick (2020). Legging It: An Evolutionary Approach to Morphological Adaptation for a Real-World Quadruped Robot. University of Oslo. ISSN 1501-7710. Full text in research archive
  • Næss, Torgrim Rudland; Martin, Charles Patrick & Tørresen, Jim (2019). A Physical Intelligent Instrument using Recurrent Neural Networks. University of Oslo.
  • Wallace, Benedikte & Martin, Charles Patrick (2018). Predictive songwriting with concatenative accompaniment. University of Oslo.
  • Brustad, Henrik & Martin, Charles Patrick (2018). Digital Audio Generation with Neural Networks. University of Oslo.

See all works in Cristin

Published 22 Aug. 2016 14:14 - Last modified 17 Dec. 2019 11:01

Projects

No ongoing projects