Robotics/Music: Prediction for Robots and Interactive Music (multiple projects)

The master projects outlined below are a part of the research project Engineering Predictability with Embodied Cognition (EPEC).

Overall project outline

Humans are superior to computers when it comes to using senses and learned knowledge to choose the best actions. This project aims to develop human-inspired models of prediction and apply them in robot and interactive music systems. The goal is to develop predictive models as an alternative to the more traditional reactive systems. We apply these models in embedded and mobile systems, in the fields of music technology and robotics.

A mobile interactive music app and a quadruped robot serve as use cases for machine learning and prediction.

Applying machine learning for prediction in interactive music and robotics.

Master projects in Learning Models and Predictions

Learning to predict by observing physical systems

It has recently been demonstrated that Deep Learning can be applied to predict how physical systems behave, simply by observing a large number of videos of these systems. So far, this has mainly been explored in simple simulations of physical systems, such as simulated billiard tables or simulations of gravitational forces. However, since these systems learn directly from video, they should also be able to learn real-world dynamics. This project will explore the ability of such Visual Interaction Networks to learn predictions from observing the real world.
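
For a flavour of what such a predictor involves, here is a minimal Keras sketch (Keras is a tool we use elsewhere in the project). It is an illustrative convolutional next-frame model, not the Visual Interaction Network architecture itself, and all shapes and layer sizes are assumptions:

    # Sketch: predict the next video frame from a stack of past frames.
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    FRAMES, H, W = 4, 64, 64  # four past 64x64 grayscale frames

    model = keras.Sequential([
        layers.Input(shape=(H, W, FRAMES)),  # past frames stacked as channels
        layers.Conv2D(32, 5, padding="same", activation="relu"),
        layers.Conv2D(32, 5, padding="same", activation="relu"),
        layers.Conv2D(1, 5, padding="same", activation="sigmoid"),  # next frame
    ])
    model.compile(optimizer="adam", loss="mse")

    # Training pairs come from sliding windows over recorded video:
    # x holds the past frames, y the frame that actually followed.
    x = np.random.rand(8, H, W, FRAMES)  # placeholder for real video data
    y = np.random.rand(8, H, W, 1)
    model.fit(x, y, epochs=1)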

Learning multimodal prediction

Collecting and using multimodal data is a key part of the project. That can include video, sound, accelerometer data, etc. An interesting problem is using such data to predict future events across modalities. An example is predicting the crashing sound that will follow when one sees a glass falling off a table. Recent developments in deep learning have applied such methods to predicting sound from images, in order to automatically generate sound for video. This project will explore this and other inter-modal prediction opportunities, especially with regard to their utility for our two application targets of robotics and music.
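
A minimal sketch of the inter-modal setup, assuming time-aligned pairs of video frames and audio features as training data; the shapes and sizes are illustrative only:

    # Sketch: map one video frame to a short slice of audio features.
    from tensorflow import keras
    from tensorflow.keras import layers

    image_in = keras.Input(shape=(64, 64, 3))          # one video frame
    h = layers.Conv2D(32, 3, strides=2, activation="relu")(image_in)
    h = layers.Conv2D(64, 3, strides=2, activation="relu")(h)
    h = layers.Flatten()(h)
    audio_out = layers.Dense(128)(h)                   # e.g. 128 spectrogram bins

    model = keras.Model(image_in, audio_out)
    model.compile(optimizer="adam", loss="mse")
    # model.fit(frames, spectrogram_slices, ...) with time-aligned pairs
    # (frames and spectrogram_slices are hypothetical dataset names).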

Avoiding accidents by predicting the future

Through Deep Learning it has recently become possible to train neural networks to predict future frames of a video stream, e.g. from dashboard cameras in cars. An exciting opportunity is to use such predictions to inform and improve the control of robots and self-driving vehicles. For instance, if a collision is predicted to be likely, a car could take evasive action. An obvious challenge is the limited amount of training data: collisions may not be frequent enough to properly train a detection algorithm. One way to solve this may be to gather training data from a simulation of crashing vehicles. This and other challenges related to predicting future collisions will be the focus of this project.
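
A minimal sketch of the collision-prediction idea, assuming labelled sequences from a simulator; the architecture and sizes are placeholders, not a proposed solution:

    # Sketch: classify whether a collision follows a short frame window.
    from tensorflow import keras
    from tensorflow.keras import layers

    FRAMES, H, W = 8, 64, 64
    model = keras.Sequential([
        layers.Input(shape=(H, W, FRAMES)),   # recent dashcam frames as channels
        layers.Conv2D(32, 5, strides=2, activation="relu"),
        layers.Conv2D(64, 5, strides=2, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(1, activation="sigmoid"),  # P(collision within k frames)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    # Train on simulated sequences labelled 1 if a crash follows within
    # k frames, 0 otherwise; evasive action triggers above a threshold.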

Background: Programming and machine learning.

Recommended courses: INF4490 - Biologically inspired computing, INF5860 - Machine Learning for Image Analysis

More information about learning and prediction projects: Postdoc Kai Olav Ellefsen and Jim Tørresen

Master projects in Interactive Music

In interactive music applications, listeners can create or modify music using sensors on their body or in their mobile devices, new musical controllers, or predictive machine learning systems. These systems have great potential to help novice musicians create music, to create "AI-musicians" that simulate ensemble performances, or to remix and modify the music we listen to through systems such as Spotify. In these projects, you will explore machine learning, creative computing, and the intersection between arts and technology.

Predictive Smartphone Music with MicroJam, iOS, and Android

We have recently released an iOS app, MicroJam, that lets users make music in a social-media interface. The app is designed to connect users who are separated in time and space, helping them create ensemble performances through their phones. We're looking for students to help implement new predictive machine learning algorithms in this app, on either Android or iOS. Your job will be to apply machine learning techniques to our corpus of touch-screen performance data and to incorporate TensorFlow or Keras models into a mobile app for public distribution. Your work will be in data science (using Python) as well as in a cutting-edge language such as Swift or Kotlin.
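
To illustrate the kind of model involved, here is a minimal Keras sketch that assumes touch performances are encoded as sequences of (x, y, time-delta) events; the real corpus format and model design are part of the project:

    # Sketch: predict the next touch event from the previous ones.
    from tensorflow import keras
    from tensorflow.keras import layers

    SEQ_LEN, FEATURES = 32, 3   # 32 past events of (x, y, time-delta)
    model = keras.Sequential([
        layers.Input(shape=(SEQ_LEN, FEATURES)),
        layers.LSTM(64),
        layers.Dense(FEATURES),  # predicted next (x, y, time-delta)
    ])
    model.compile(optimizer="adam", loss="mse")
    # After training, such a model could be exported (e.g. to
    # TensorFlow Lite) to run inside an Android or iOS app.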

Automatic music generation with evolutionary algorithms

Multiple performances overlaid in MicroJam.

Evolutionary algorithms are known to generate creative solutions, often very different from what human engineers or artists would produce. When generating art with evolutionary algorithms, an obvious problem is how to design the objective function: how does one quantify that one artwork is better than another? Crowdsourcing evaluations from online users is one interesting technique, which has been applied to evolving pictures, sculptures, and music.

Our iOS app MicroJam offers a promising platform for crowdsourcing such evaluations and evolving short musical compositions. This project is suitable for students who would like to do some app development and explore research questions related to how short musical compositions can be represented and evolved.
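
A minimal sketch of the evolutionary loop, with crowdsourced ratings standing in as the fitness function. Melodies are simply lists of MIDI note numbers, and get_user_ratings is a hypothetical placeholder for ratings collected through an app such as MicroJam:

    # Sketch: evolve short melodies with user ratings as fitness.
    import random

    def random_melody(length=16):
        return [random.randint(48, 72) for _ in range(length)]  # MIDI notes

    def mutate(melody, rate=0.1):
        return [random.randint(48, 72) if random.random() < rate else n
                for n in melody]

    def get_user_ratings(population):
        # Placeholder: in the real system each melody would be rated
        # by listeners; random scores let the sketch run as-is.
        return [random.random() for _ in population]

    population = [random_melody() for _ in range(20)]
    for generation in range(10):
        fitness = get_user_ratings(population)
        ranked = [m for _, m in sorted(zip(fitness, population),
                                       key=lambda p: p[0], reverse=True)]
        parents = ranked[:10]                        # keep the best half
        population = parents + [mutate(random.choice(parents))
                                for _ in range(10)]  # refill with mutants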

Embedded Predictive Musical Instruments

Embedded computing platforms such as Arduino, Raspberry Pi, or BeagleBone and Bela can be used to make self-contained digital musical instruments. These systems turn data from various sensors (motion, touch, light, climate, etc.) into sounds. Your job will be to implement predictive machine learning algorithms in an embedded sound-making device. This project will prepare you to combine data science with hardware and interaction design to create systems that users will want to play with every day!
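
A minimal sketch of the idea, with stand-ins for the sensor input and sound output so it runs without hardware; a tiny linear extrapolation plays the role that a learned predictive model would take in the actual instrument:

    # Sketch: anticipate the next sensor value and sound a note ahead
    # of the player's motion.
    import math
    from collections import deque

    def read_sensor(t):
        # Stand-in for a real sensor read (accelerometer, light, etc.):
        # a slow sine wave so the sketch runs without hardware.
        return 0.5 + 0.5 * math.sin(t / 10.0)

    def play_tone(pitch):
        # Stand-in for the platform's sound output (e.g. sending the
        # frequency to a synthesis engine).
        print(f"playing {pitch:.1f} Hz")

    def predict_next(values):
        # Tiny linear extrapolation; a learned model could replace this.
        if len(values) < 2:
            return values[-1]
        return values[-1] + (values[-1] - values[-2])

    history = deque([0.0], maxlen=4)   # recent sensor readings
    for t in range(100):
        history.append(read_sensor(t))
        predicted = predict_next(list(history))
        play_tone(220.0 + 440.0 * predicted)  # map predicted motion to pitch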

Deep Learning for Musical Composition

Algorithmic composition (or generative music) involves using a computer program to write works of music. Recent advances in Neural Networks have already found applications in the visual arts (e.g., style transfer), and new efforts in music are starting to appear (Project Magenta and our own Neural Touch-Screen Ensemble). Your job in this project will be to build an interactive machine learning system that composes music or sound. This will include implementing and comparing machine learning algorithms, collecting and managing a training corpus of musical data, and testing the system with new users!
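
For a flavour of one common generative-music recipe (in the spirit of, but not a reimplementation of, systems like Project Magenta), here is a minimal sketch of a next-note LSTM that is sampled step by step; the vocabulary size and shapes are assumptions:

    # Sketch: train an LSTM to predict the next note, then sample from it.
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    N_NOTES, SEQ_LEN = 64, 32   # 64 possible pitches, 32-note context
    model = keras.Sequential([
        layers.Input(shape=(SEQ_LEN,), dtype="int32"),
        layers.Embedding(N_NOTES, 32),
        layers.LSTM(128),
        layers.Dense(N_NOTES, activation="softmax"),  # next-note distribution
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    # Generation: feed a seed melody, sample a note, append, repeat.
    melody = list(np.random.randint(0, N_NOTES, SEQ_LEN))  # placeholder seed
    for _ in range(16):
        probs = model.predict(np.array([melody[-SEQ_LEN:]]), verbose=0)[0]
        probs = probs.astype("float64")
        probs /= probs.sum()          # guard against float32 rounding
        melody.append(int(np.random.choice(N_NOTES, p=probs)))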

More information about music related projects: Postdoc Charles Martin and Jim Tørresen

Background: Programming, musical interest/skills, machine learning

Recommended courses: INF5261 - Development of mobile information systems and services, INF4490 - Biologically inspired computing, MUS2006 - Music and Movement (Musikk og bevegelse), MUS4218 - Cognitive Musicology (Kognitiv musikkvitenskap)

Master projects in Robot Sensing, Modelling and Prediction

A number of robotics master projects are available related to the EPEC project:

  • Design a robot sensing platform consisting of different kinds of sensors that can effectively collect multimodal sensor data in parallel. Explore and compare algorithms for feature extraction and classification of the data provided by the sensors, for selected target actions of other robots or a human.
  • Develop a predictive model of human behaviour as seen from a robot. The accuracy should be tested for a variety of different human behaviours (varying in complexity and speed). This project partly complements the previous project in that it applies the developed sensing platform; alternatively, a marker-based motion capture system can be used.
  • Develop a precise collision avoidance model for objects (for example, a robot arm) using 3D point cloud data. There are many ways to model this, but the model has to be simple enough to allow fast processing (a minimal sketch follows this list).
  • Develop a predictive model of coordinated human and robot motions and actions. Experiments with coordinated actions between humans and robots should be undertaken to demonstrate the improvement in the behaviour of the robot.
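
A minimal sketch of the point-cloud idea from the third bullet above: the arm is approximated by a few bounding spheres, and each point-cloud frame is checked against them with a brute-force NumPy test (a k-d tree could replace it for larger clouds); all sizes are placeholders:

    # Sketch: flag a collision when any cloud point enters an arm sphere.
    import numpy as np

    def in_collision(cloud, sphere_centers, radius=0.10):
        """cloud: (N, 3) points in metres; sphere_centers: (M, 3)."""
        # Pairwise distances between every point and every sphere centre.
        diff = cloud[:, None, :] - sphere_centers[None, :, :]   # (N, M, 3)
        dist = np.linalg.norm(diff, axis=2)                     # (N, M)
        return bool((dist < radius).any())

    # Placeholder data: a random cloud and three spheres along an arm.
    cloud = np.random.uniform(-1.0, 1.0, size=(5000, 3))
    arm = np.array([[0.0, 0.0, 0.2], [0.0, 0.0, 0.5], [0.0, 0.0, 0.8]])
    print(in_collision(cloud, arm))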

Background: Programming, knowledge in robotics (e.g. INF3480) and machine learning (e.g. INF4490)

More information about robotics related projects: Kyrre Glette and Jim Tørresen

Benefits of taking a master project related to an externally funded research project:

  • close collaboration with doctoral students (PhD) and postdoctoral researchers working on related topics
  • strong focus on advancing international state-of-the-art research and on publication in international journals and conferences (beneficial for later applications for PhD or researcher positions)
  • funding available for presenting published results at international conferences

Competences relevant for jobs in industry: programming of embedded systems, system modelling, prototyping of sensor/mechanical systems, smartphone app development, experience from an upcoming application area (service robots, interactive musical systems, mobile development, machine learning)


Keywords: robotics, music, embedded systems, artificial intelligence, machine learning