SyncUp: A Wearable Device for Remote Music Collaboration
December 31st, 2020
Our challenge was to design a remote collaboration tool. Long-distance work has exploded in the wake of the coronavirus pandemic, and we predict that this remote work trend will persist even after the pandemic, as people and companies have adjusted their lifestyles and working environments. Given this timely topic, we decided to present a device that supports people collaborating at a distance.
- CMU Interaction Design Studio
- Collaborators | Hannah Kim, Karen Escarcha, Yu Chuan Shan; advised by Dina El Zanfaly and Kyuha Shim
- Duration | 7 weeks (Fall 2020)
- Responsibility | design research, conceptual design, interaction design, 2D and 3D rendering, video editing
- Tools | Figma, Arduino, Adobe Premiere Pro, After Effects, SketchUp, Rhino, Keyshot
Screen-oriented interaction has increased during the pandemic, reducing nonverbal languages such as body language. As a result, people feel more pressure when they work together online. To address this problem, our goals were to: (1) research remote collaboration to understand existing problems and design opportunities, and (2) design a multisensory experience that utilizes intermodality.
SyncUp is a wearable multisensory device to enhance nonverbal communication between a conductor and musicians in a remote setting. Utilizing eye-tracking and gestural data, the device translates the conductor’s directions into haptic and light feedback on a musician’s wrist.
Scenario 1 | Synchronous Remote Rehearsal
In an online music rehearsal, musicians participate in the ensemble while the wearable device conveys the conductor's tempo as well as when they need to cue in.
Scenario 2 | Rehearsal Review & Individual Practice
After the rehearsal, the conductor reviews the rehearsal recordings and gives feedback to the musicians. With that feedback, the wearable device helps the musicians practice individually.
Given the topic, we hypothesized that the performing arts face the greatest difficulty collaborating remotely because nonverbal cues disappear. Interviews with practitioners in performance and music confirmed there were even more difficulties than we expected. All of the practitioners we spoke with, both directors and performers, emphasized that the "signs of the director" and a sense of "liveliness" are vital elements of a performance.
Embody the liveness of an in-person music performance and deliver a conductor’s vision in a remote rehearsal
From academic readings to video tutorials, we gained insight into how conductors communicate through their gestures. Unfortunately, the current remote environment hinders this nonverbal communication since rehearsal is often asynchronous and requires multiple pieces of technology.
In the case study below, the conductor recorded her gestures and created a click track for musicians to listen to which helps them stay on the same beat. Orchestra members then recorded themselves playing while juggling multiple visual and audio signals.
Remote Orchestra Case Study (Duke Today)
Our design concept features a tangible device that enhances nonverbal communication between the conductor and musicians in a remote rehearsal setting. We imagined a system that translates the conductor's gestures into light patterns through a device in the musician's space. The use of light is partly inspired by the Maluma-Takete effect, a cross-modal mapping between visual shapes and sounds.
The following artifacts explore the interactions between a conductor (Ava) and a musician (Darien) who are remotely rehearsing for an upcoming performance using our light device. We designed the device to communicate the tempo the conductor is setting as well as dynamic gestures such as play louder or softer (crescendo/decrescendo). The device also serves as a helpful tool when the musician is practicing by themselves.
We explored diverse forms for the device such as a wall installation, a music stand mount, and a table light. We leaned towards integrating it with the music stand since that is where the musician’s attention will primarily be. We explored form through sketching, paper prototyping, and 3D modeling.
To prototype the light, we used a Leap Motion Controller connected to an Arduino. We created a low-fidelity prototype that recognizes up/down and left/right gestural inputs, which map to brightness/intensity and color change. We decided that crescendo/decrescendo (an up/down conductor gesture) would correlate to light intensity.
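The sketch below is a minimal illustration of how the Arduino side of such a prototype could work. It assumes a host computer reads the Leap Motion Controller through its SDK and forwards normalized hand coordinates over serial as "x,y" lines; the pin numbers, serial format, and NeoPixel ring are illustrative assumptions rather than our exact wiring.

```cpp
// Hypothetical Arduino side of the low-fidelity light prototype.
// Assumption: a host script reads the Leap Motion Controller and sends
// normalized hand coordinates over USB serial as lines like "x,y\n",
// where x (left/right) and y (up/down) are floats between 0.0 and 1.0.

#include <Adafruit_NeoPixel.h>

const int LED_PIN = 6;        // data pin for a small NeoPixel ring (assumed)
const int NUM_PIXELS = 12;

Adafruit_NeoPixel ring(NUM_PIXELS, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  Serial.begin(115200);
  ring.begin();
  ring.show();                // start with all pixels off
}

void loop() {
  if (Serial.available()) {
    // Parse one "x,y" line sent by the Leap Motion host script.
    float x = Serial.parseFloat();   // left/right -> hue (color change)
    float y = Serial.parseFloat();   // up/down    -> brightness (crescendo/decrescendo)
    Serial.readStringUntil('\n');    // discard the rest of the line

    uint16_t hue = (uint16_t)(constrain(x, 0.0, 1.0) * 65535);
    uint8_t brightness = (uint8_t)(constrain(y, 0.0, 1.0) * 255);

    uint32_t color = ring.gamma32(ring.ColorHSV(hue, 255, brightness));
    ring.fill(color);
    ring.show();
  }
}
```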
It was difficult to determine how easily musicians could perceive a conductor’s gestures through light
We did another round of user research to test out our concept. We interviewed 5 musicians and 2 conductors (including the musician/conductor duo from the case study we reviewed).
We discovered that light would not be appropriate because it would require musicians to focus on yet another visual cue and to learn a new visual language.
Even though we were two weeks away from the project deadline, we prioritized listening to our users and pivoted by (1) exploring haptic feedback as output instead of light and (2) focusing on tempo and cuing in as the main forms of gestural input (as opposed to stylistic elements such as crescendo/decrescendo).
Fortunately, there are a few case studies that support this new direction such as the Haptic Baton developed for blind musicians and the Pulse metronome watch. Because of these existing technologies, we felt confident in switching from light to haptic feedback.
Ava and her ensemble, which includes violinist Darien, are rehearsing together online for an upcoming live performance. Darien relies on his wearable device to let him know the tempo Ava is setting as well as when Ava is cuing him to start playing.
Ava reviews the rehearsal recording and gives asynchronous feedback to musicians on how they can improve. Darien relies on the wearable device to help him practice according to Ava’s feedback.
The wearable device wraps around the musician’s wrist and provides haptic and light feedback to help the musician understand the conductor
The wrist is an ideal spot for a wearable device: it reacts quickly to visual feedback and is a suitably sensitive location on the body for delivering vibro-tactile feedback. Using haptic feedback to convey tempo is a reliable alternative to an auditory cue (a click track) and lessens the musicians' auditory cognitive load. The use of light is also intentional, since both music and color map to aspects of emotion. For example, loud, high-energy sounds match bright, vivid yellow hues, while laid-back, quiet sounds match muted, cool colors like blue.
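As a rough illustration of the tempo-to-haptic mapping, the sketch below pulses a vibration motor once per beat at a given BPM. It is a minimal sketch under assumed hardware (a coin vibration motor driven from a PWM pin) and a hard-coded tempo; in the actual concept, the tempo and cues would come from the conductor's gestures over a wireless link.

```cpp
// Minimal sketch: render tempo as vibro-tactile pulses on the wrist.
// Assumptions: a coin vibration motor driven through a transistor on a PWM
// pin, and a hard-coded BPM; pin, pulse length, and tempo are illustrative.

const int MOTOR_PIN = 5;            // PWM pin driving the vibration motor (assumed)
const int PULSE_MS = 60;            // length of each tactile "tick"
float bpm = 96.0;                   // tempo set by the conductor (assumed value)

unsigned long lastBeat = 0;

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  unsigned long beatInterval = (unsigned long)(60000.0 / bpm);  // ms per beat

  if (millis() - lastBeat >= beatInterval) {
    lastBeat += beatInterval;       // advance by the interval to avoid drift
    analogWrite(MOTOR_PIN, 200);    // short, firm pulse marks the beat
    delay(PULSE_MS);
    analogWrite(MOTOR_PIN, 0);      // motor off between beats
  }
}
```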
We created physical prototypes and 3D renders to convey the form design (shape, size, materiality, etc.)
We created a physical prototype with low-temperature thermoplastic, which is malleable when heated and hardens as it cools. The 3D-rendered images communicate how users can form the device to their wrist to ensure a snug fit. We imagine using thermoplastic polyurethane for the external material of the device. The material is similar to a silicone phone case: long lasting and flexible, yet it retains its shape.
Multisensory Experiences
We appreciated the opportunity to design non-screen-based interactions such as gestural input and haptic feedback. It revealed new ways of utilizing our senses within our built environments.
Speculative vs. Viable
Our original concepts leaned towards the speculative side. Fortunately, hearing directly from musicians helped us develop our concept’s viability while still exploring novel interactions.
Designing the User Interface
We prioritized designing the tangible interface and, within the short timeframe, did not have time to develop the UI and brand. We still need to do this to better showcase how we might incorporate eye-tracking and asynchronous feedback.