Dr. Squiggles - RITMO
Dr. Squiggles is an interactive musical robot that listens to rhythms and plays along by tapping on the table.
Figure 1: Two happy Dr. Squiggles robots, designed and built at ROBIN.
The robot interaction lab at UiO has been developing the musical robots shown in Figure 1. The robot’s key components are
- solenoids that the robot uses to tap rhythms on a table,
- a contact microphone it uses to listen,
- a low-resolution LED display (the eye),
- an embedded Raspberry Pi computer, and
- a C-language software framework for DSP, beat tracking, rhythm generation, and related tasks.
In this project, you will use the robot to investigate human-robot or robot-robot interaction in the context of music. Beyond that, the project is open to a wide variety of possible avenues of exploration, and the student is welcome to tailor the project to their own skills and interests. Some examples of possible projects are as follows, and prospective students may choose from these or propose something else.
1.1 Human-robot games
Develop games for a person to play with the robot. For instance, develop a call-and-response game in which the robot plays rhythmic fragments and the person has to repeat them. The robot would then have to check that the person played each rhythm correctly. Over time, the robot could tailor the complexity and speed of the rhythms to the skill of the player. The game could be pedagogical in nature, e.g. teaching how to play a particular rhythmic figure in different contexts, halving the tempo after a few failed attempts until the player can accurately reproduce the rhythm, etc.
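The adaptive-difficulty loop described above could be sketched roughly as follows. This is only an illustrative sketch, not part of the robot's existing framework: `round_passed`, `adapt_tempo`, and the thresholds (a 50 ms onset tolerance, three failures before halving the tempo) are all hypothetical choices.

```c
#include <math.h>

/* Illustrative sketch of the call-and-response game logic; none of these
 * names or thresholds come from the Dr. Squiggles framework. */

/* A round passes if every played onset lands within `tol` seconds of the
 * corresponding target onset (both arrays hold `n` onset times). */
static int round_passed(const double *target, const double *played,
                        int n, double tol)
{
    for (int i = 0; i < n; i++)
        if (fabs(played[i] - target[i]) > tol)
            return 0;
    return 1;
}

/* Tailor difficulty to the player: halve the tempo after three consecutive
 * failures, and speed back up by 10% after each success. `fails` holds the
 * running count of consecutive failures. */
static double adapt_tempo(double bpm, int passed, int *fails)
{
    if (passed) {
        *fails = 0;
        return bpm * 1.10;
    }
    if (++(*fails) >= 3) {
        *fails = 0; /* reset the streak after halving */
        return bpm * 0.5;
    }
    return bpm;
}
```

A full game loop would then generate a fragment, tap it out through the solenoids, collect the player's onsets from the contact microphone, and feed both onset lists to these two functions.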
1.2 Robot ensembles
Given a swarm of several autonomous robots that are listening to each other, develop an algorithm that each robot can use to generate rhythms, such that interesting music arises in the swarm as a whole. How can the robots collaborate so that solos, duets, and tutti spontaneously arise in the music? How does a robot know when it should not play anything? How can new musical ideas be introduced, developed, and propagated through the swarm? One idea would be to use something like a bargaining algorithm: a limited number of notes is available to be played on any beat, and the robots have to negotiate with one another for the privilege of playing some of those notes.
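One way to prototype the bargaining idea is a sealed-bid auction per beat. The sketch below is hypothetical, not part of the robot framework: each robot bids for the right to play, and only the `slots` highest bidders play on that beat.

```c
/* Hypothetical sealed-bid allocation for the bargaining idea above:
 * on each beat there are `slots` notes available, and robot `i` plays
 * only if its bid ranks among the `slots` highest of the `n` bids.
 * Ties are broken in favour of the lower robot index. */
static int wins_slot(const double *bids, int n, int slots, int i)
{
    int rank = 0;
    for (int j = 0; j < n; j++)
        if (bids[j] > bids[i] || (bids[j] == bids[i] && j < i))
            rank++;
    return rank < slots;
}

/* One possible bidding heuristic: a robot grows more eager the longer it
 * has stayed silent, so turn-taking (and hence solos, duets, and tutti)
 * can emerge as the bids ebb and flow across the swarm. */
static double make_bid(double eagerness, int beats_since_played)
{
    return eagerness * (1 + beats_since_played);
}
```

If every robot broadcasts its bid, each one can run `wins_slot` locally; a robot whose bid loses every slot thereby knows it should not play anything on that beat.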
1.3 Quantization
Humans purposefully imbue musical performances with expressive timing, i.e. they play notes earlier or later than strictly notated. This makes it difficult for robots to interpret what humans play. Quantization is the process of recovering the symbolic rhythm (as it would be notated) from a performance imbued with such timing. A project could survey existing quantization algorithms and, in particular, implement quantization on the robot and show that it makes people find the robot easier, better, or more fun to interact with.
1.4 Communication through the eye
Investigate how the robot’s eye can be used to convey information to a person while playing music together. There is a small body of existing literature on how humans use nonverbal communication when playing together. How can that literature be applied to human-robot interaction?
2 Methods and outcomes
The methods will be determined by the exact project chosen by the student, but will likely include some combination of 1) developing a software system, 2) using numerical methods to test the system, and/or 3) performing a user study or otherwise testing the system with people. Ideally the student would develop the system directly on one of the existing robots, but alternatively the student could simulate the robot in the environment of their choice, e.g. Max/MSP, with the understanding that the finished system could be ported to the robots at a later date.
The skills the student will need depend on the project, but will likely include working with digital audio, computer programming, conducting studies with human subjects, and writing up findings clearly and concisely.
Students working on this will be part of the larger Dr. Squiggles project, which may include researchers at RITMO and other Master’s students in the MCT Program. This could be a good opportunity for students who are interested in sharing ideas and methods across disciplines. Be aware, however, that each student is ultimately responsible for their own separate thesis.
More information about the project is available on the Dr. Squiggles Project Page.