Visualizing Telematic Music Performance

Visualizing Telematic Music Performance is an ongoing research project at the University of Michigan investigating musical communication and perception. The subtle physical gestures musicians use to make music together are often lost when telematic performances are attempted over video chat. Will mechatronic displays controlled by musicians’ motion capture data allow for deeper and richer musical communication?

VTMP is led by Dr. Michael Gurevich, Dr. John Granzow, Professor Matt Albert, and Dr. Brent Gillespie. VTMP is a part of the Faculty Engineering/Arts Student Teams (FEAST) Program at the University of Michigan and is generously sponsored by ArtsEngine and the Arts Initiative at Michigan.

Gavin Ryan performing with LARS

December 3, 2022

I have served as the team’s lead roboticist since May 2022. Building upon the ideas and mechanisms of the team’s prior robots, I designed and fabricated our current robot, LARS.

LARS features 3 Dynamixel servo motors and 6 SG90 micro servos. The structure of the robot is made from 3D printed parts, machined aluminum, and off-the-shelf hardware. The electronics consist of an Arduino Uno, a Dynamixel Arduino Shield, a PCA9685 servo driver, and an nRF24L01 wireless transceiver.
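As a rough illustration of how those parts fit together, here is a minimal Arduino-style sketch. The pin numbers, servo IDs and channels, radio address, and payload format are all illustrative assumptions, not the project’s actual firmware.

```cpp
#include <SPI.h>
#include <RF24.h>                     // nRF24L01 wireless transceiver
#include <Adafruit_PWMServoDriver.h>  // PCA9685 16-channel servo driver
#include <DynamixelShield.h>          // ROBOTIS Dynamixel Arduino Shield

RF24 radio(9, 10);                    // CE, CSN pins (assumed wiring)
Adafruit_PWMServoDriver pwm;          // PCA9685 at default I2C address 0x40
DynamixelShield dxl;

// One target per actuator: 3 Dynamixels + 6 SG90 micro servos.
// Kept to 12 bytes so it fits the nRF24's 32-byte payload limit.
struct Pose {
  int16_t dxl[3];   // Dynamixel targets, tenths of a degree
  uint8_t sg90[6];  // micro-servo targets, whole degrees (0-180)
};
Pose pose;

const byte address[6] = "LARS1";      // hypothetical radio pipe address

void setup() {
  radio.begin();
  radio.openReadingPipe(1, address);
  radio.startListening();

  pwm.begin();
  pwm.setPWMFreq(50);                 // standard 50 Hz hobby-servo signal

  dxl.begin(57600);
  dxl.setPortProtocolVersion(2.0);
  for (uint8_t id = 1; id <= 3; id++) {
    dxl.setOperatingMode(id, OP_POSITION);
    dxl.torqueOn(id);
  }
}

void loop() {
  if (radio.available()) {
    radio.read(&pose, sizeof(pose));

    // Micro servos: map 0-180 degrees to approximate PCA9685 pulse counts.
    for (uint8_t ch = 0; ch < 6; ch++) {
      int pulse = map(pose.sg90[ch], 0, 180, 110, 500);
      pwm.setPWM(ch, 0, pulse);
    }

    // Dynamixels: IDs 1-3 in position mode, angles in degrees.
    for (uint8_t id = 1; id <= 3; id++) {
      dxl.setGoalPosition(id, pose.dxl[id - 1] / 10.0f, UNIT_DEGREE);
    }
  }
}
```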

Me and LARS

November 7, 2022

Video Demos:

December 3, 2022 - The University of Michigan and the University of Virginia host a synchronous telematic concert using a pair of matching robots.

Our collaborators for this exciting concert:

Here is a short video demonstration of the robot’s degrees of freedom and range of expression.

Prototyping and Design

The desk lamp was a clear inspiration for LARS. Not only does a lamp-like morphology carry strong anthropomorphic connotations in pop culture, but the spring-balanced nature of the lamp also has mechanical advantages.

The spring-balance system is what gives us a quick and nimble response from small motors: as in the classic spring-balanced desk lamp, the springs compensate for gravity so that the motors don’t have to.
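In rough terms (the symbols below are mine, introduced for illustration rather than taken from the project):

```latex
% Gravity torque on an arm of mass m whose center of mass sits a
% distance l from the pivot, with the arm at angle \theta from vertical:
\tau_g = m g l \sin\theta
% A zero-free-length spring of stiffness k, anchored a height a above
% the pivot and attached a distance b along the arm, counters this with:
\tau_s = k a b \sin\theta
% Choosing k a b = m g l makes the two cancel at every arm angle, so the
% motors spend their torque on motion rather than on holding a pose.
```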

Aluminum pieces in the robot were custom designed and CNC machined. Featured here is a pair of plates used in an early prototype’s giant ball bearing base.

A huge inspiration for the ‘mouth’ of LARS is Hypersonic Design’s “Diffusion Choir”. We used a vinyl cutter to accurately perforate paper with folding patterns similar to those in Diffusion Choir.

In parallel with the robot-building process, we recorded musicians in motion capture suits. Their data was processed through Max/MSP to control the robots.

Percussionist: Sui Lin Tam

Next, we began controlling the robots using live motion capture data. We scaled each musician’s data differently to suit the way they moved when playing their instrument; a sketch of that kind of mapping follows below.

Featuring Matias Viliplana in the mo-cap suit
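As a sketch of what per-musician scaling can look like (the numbers and names below are illustrative, not the team’s actual Max/MSP patch), each incoming coordinate is linearly remapped from the range a given musician actually covers onto the joint’s usable range:

```cpp
#include <algorithm>

// Linearly remap one mocap coordinate into a joint-angle range, with
// clamping so an outlier gesture cannot drive the robot into its limits.
struct AxisMap {
  float inMin, inMax;    // motion range this musician covers (meters)
  float outMin, outMax;  // joint's usable range (degrees)

  float apply(float x) const {
    float t = (x - inMin) / (inMax - inMin);
    t = std::clamp(t, 0.0f, 1.0f);
    return outMin + t * (outMax - outMin);
  }
};

int main() {
  // Hypothetical calibrations: a percussionist's mallet hand sweeps a
  // wide vertical range, a violinist's bow arm a narrower one, yet both
  // drive the same joint across its full travel.
  AxisMap percussion{0.9f, 1.5f, 10.0f, 170.0f};
  AxisMap violin{1.0f, 1.3f, 10.0f, 170.0f};

  float angle = percussion.apply(1.2f);  // mid-sweep maps to 90 degrees
  (void)angle;
  return 0;
}
```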

The neck joint was made with a flexible 3D printed part that allowed for a more fluid and natural response than a typical stacked pan-tilt motor arrangement would.

The base actuator was built using the EMBiR Lab’s open-source actuator design. The 7.5:1 gear ratio lets our mid-sized motor deliver much higher torque at the base joint.
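The arithmetic behind that choice is straightforward; the efficiency figure below is an assumption for illustration, not a measured value:

```latex
% A gear ratio N trades speed for torque; with gearbox efficiency \eta:
\tau_\mathrm{out} = \eta \, N \, \tau_\mathrm{motor}, \qquad
\omega_\mathrm{out} = \omega_\mathrm{motor} / N
% e.g. with N = 7.5 and \eta \approx 0.85, a motor producing 1.0 N·m
% yields roughly 6.4 N·m at the output, at 7.5 times lower speed.
```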

Past & Future

The above covers my work on the project. Here are additional links if you’d like to learn more about the project as a whole, joining the team, or the project’s history: