Here are some examples of projects that were created in the 2020 Designing Physical Interactions for Music workshop.

If you made something in the workshop and would like to share the link, please get in touch with me and I will make sure to post it here.

Anika Fuloria - TILT-A-PLAY

The TILT-A-PLAY is a device that adds effects and filters to instruments using rotational hand gestures. Any instrument and any continuously variable (non-toggle) effect or filter can be used. With an accelerometer, the TILT-A-PLAY determines the orientation of the hand (pitch, yaw, roll), which is then used as a parameter to control the sound. For example, the pitch of the hand could change the amount of wet mix in a reverb. This project uses Teensyduino, Max MSP, and Logic Pro/GarageBand.
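
To give a flavor of how such a mapping can look on the Teensy side, here is a minimal sketch, assuming an analog three-axis accelerometer (such as an ADXL335) and Teensy's usbMIDI (USB Type set to MIDI). The pins and the CC number are placeholders, not Anika's actual code.

```cpp
// Hypothetical sketch: map the roll of the hand, read from an analog
// accelerometer, onto a MIDI CC that a DAW can assign to reverb wet mix.
#include <Arduino.h>
#include <math.h>

const int X_PIN = A0, Y_PIN = A1, Z_PIN = A2; // accelerometer axes (assumed)
const int CC_REVERB_WET = 91;                 // placeholder CC number

void setup() {
  analogReadResolution(10); // 10-bit reads are plenty for a control signal
}

void loop() {
  // Center the raw readings around zero (the ADXL335 idles near mid-scale).
  float x = analogRead(X_PIN) - 512.0f;
  float y = analogRead(Y_PIN) - 512.0f;
  float z = analogRead(Z_PIN) - 512.0f;

  // Roll angle of the hand, in radians, from the gravity vector.
  float roll = atan2f(y, z);

  // Map [-pi, pi] onto the 0..127 MIDI CC range and send on channel 1.
  int cc = constrain((int)((roll + PI) / (2 * PI) * 127.0f), 0, 127);
  usbMIDI.sendControlChange(CC_REVERB_WET, cc, 1);

  delay(10); // ~100 Hz control rate
}
```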

Johannes Regnier - 2 projects

Various nature-inspired and dynamical-systems algorithms, plus a music tracker imported directly from the 90s.
Both projects are built around an LCD with a touchscreen.

Dynamical Systems 1 from Johannes Regnier on Vimeo.

Project 1: swarm algorithm, spring-mass system, bouncing ball generator,
and orbital dynamics. Data is sent as OSC packets to Max or Pd, for instance.
Control is via touchscreen, pots, and buttons.
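
To give a flavor of how one of these generators can talk to Max or Pd, here is a minimal bouncing-ball sketch that streams its height as OSC over SLIP-encoded serial using CNMAT's OSC library for Arduino. The address, physics constants, and serial port are assumptions, not Johannes's code.

```cpp
// A bouncing-ball generator: simple ballistic motion with a damped bounce,
// streamed as an OSC message that Max or Pd can decode from SLIP serial.
#include <OSCMessage.h>
#include <SLIPEncodedSerial.h>

// Hardware UART here; Teensy's USB serial would use SLIPEncodedUSBSerial.
SLIPEncodedSerial SLIPSerial(Serial1);

float h = 1.0f;           // ball height, normalized 0..1
float v = 0.0f;           // vertical velocity
const float G = -2.0f;    // gravity (illustrative value)
const float REST = 0.85f; // restitution: energy kept per bounce
const float DT = 0.01f;   // simulation step, 10 ms

void setup() {
  SLIPSerial.begin(115200);
}

void loop() {
  // Integrate the motion; reflect and damp the velocity at the floor.
  v += G * DT;
  h += v * DT;
  if (h < 0.0f) { h = 0.0f; v = -v * REST; }

  OSCMessage msg("/ball/height"); // placeholder OSC address
  msg.add(h);
  SLIPSerial.beginPacket();
  msg.send(SLIPSerial);
  SLIPSerial.endPacket();
  msg.empty();

  delay(10);
}
```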

Micro Tracker from Johannes Regnier on Vimeo.

Project 2: an 8-track music tracker, similar to the trackers of the 90s,
with additional functions such as note probabilities and randomization.
Control is via touchscreen, pots, and buttons.
MIDI goes over USB, and the main functions run in an interrupt for precise timing.
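
As a sketch of that timing core (not the actual code): Teensy's IntervalTimer can fire a precisely timed tick from an interrupt, and each step of a track can carry a note plus a probability. The step data, tempo, and single track below are invented for illustration.

```cpp
// One-track tracker skeleton: an IntervalTimer ISR advances the step with
// sample-accurate timing; loop() turns pending steps into MIDI notes.
#include <Arduino.h>

struct Step { byte note; byte probability; }; // probability in percent

Step track[8] = {
  {36, 100}, {38, 50}, {36, 75}, {38, 25},
  {36, 100}, {38, 50}, {36, 75}, {38, 90},
};

IntervalTimer ticker;
volatile int pendingStep = -1; // written by the ISR, consumed in loop()
int stepIndex = 0;

void onTick() {
  pendingStep = stepIndex;          // hand the step to loop() for MIDI I/O
  stepIndex = (stepIndex + 1) % 8;
}

void setup() {
  ticker.begin(onTick, 125000); // 125 ms per step = 16ths at 120 BPM
}

void loop() {
  noInterrupts();
  int s = pendingStep;
  pendingStep = -1;
  interrupts();

  // Roll the dice per step: this is the "note probability" feature.
  if (s >= 0 && random(100) < track[s].probability) {
    usbMIDI.sendNoteOn(track[s].note, 100, 1);
    usbMIDI.sendNoteOff(track[s].note, 0, 1); // zero-length note for brevity
  }
}
```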

Giles Bowkett - Stochastic Drum Machine

A drum machine that can be programmed like a normal drum machine or as a map of probabilities.
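
The idea in miniature, as an illustration rather than Giles's code: each cell of the pattern grid stores a probability, so a grid of 0s and 100s behaves like a normal drum machine while anything in between plays stochastically. The 2-voice, 8-step grid and note numbers are hypothetical.

```cpp
// Probability-map drum machine: a grid of percentages instead of on/off.
#include <Arduino.h>

const int STEPS = 8, VOICES = 2;

// Rows: kick, snare. 100 = always, 0 = never, anything else = percent chance.
byte grid[VOICES][STEPS] = {
  {100, 0, 0, 0, 100, 0, 30, 0},  // kick: deterministic plus one "maybe"
  {0, 0, 100, 0, 0, 60, 100, 10}, // snare
};

bool shouldTrigger(int voice, int step) {
  return random(100) < grid[voice][step];
}

void setup() {
  randomSeed(analogRead(A0)); // seed from floating-pin noise
}

void loop() {
  for (int step = 0; step < STEPS; step++) {
    for (int v = 0; v < VOICES; v++) {
      byte note = (v == 0) ? 36 : 38; // GM kick and snare
      if (shouldTrigger(v, step)) {
        usbMIDI.sendNoteOn(note, 100, 10);
        usbMIDI.sendNoteOff(note, 0, 10);
      }
    }
    delay(125); // 16th notes at 120 BPM; a real clock would be interrupt-driven
  }
}
```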

Kevin Casey Simon - Strum Gesture Tests

Proof of Concept: https://youtu.be/aCkG5XMUJuc
Project Overview: https://youtu.be/eJpEHv56NKk
Phase 3: https://youtu.be/37gAaEV4bQE, https://youtu.be/bLUMifuqkgg, https://youtu.be/sX_2kl56UJc
Final: https://youtu.be/AzsRcjreC6o, https://youtu.be/acNURZoocvU, https://youtu.be/jweWoB75JWw
A strum-gesture-based MIDI controller was built. The project evolved in four phases, which are shown in the videos above. The first phase used small buttons that were difficult to play but demonstrated proof of concept; the buttons handled chord selection, articulation (down stroke, up stroke, mute, silence), and a mode change that enabled different strumming and picking patterns. The second phase replaced the small buttons with responsive arcade buttons. Phase three added dynamics with piezo sensors. The final phase (for now) used force-sensing resistors, including two in a linear format that tracked down-stroke and up-stroke strumming.
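
One plausible way a linear sensor can separate down strokes from up strokes, assuming it reports the contact position as an analog voltage (as a SoftPot-style strip does, with a pull-down so the pin reads near zero when untouched): track the direction the contact point moves while pressed, and classify the stroke on release. Thresholds, pins, and CC numbers are assumptions.

```cpp
// Classify strum direction from a linear position sensor.
#include <Arduino.h>

const int STRIP_PIN = A0;
const int TOUCH_THRESHOLD = 40; // below this, treat the strip as untouched

bool touching = false;
int startPos = 0, lastPos = 0;

void setup() {}

void loop() {
  int raw = analogRead(STRIP_PIN);

  if (raw > TOUCH_THRESHOLD && !touching) {        // finger lands
    touching = true;
    startPos = lastPos = raw;
  } else if (raw > TOUCH_THRESHOLD && touching) {  // finger drags
    lastPos = raw;
  } else if (raw <= TOUCH_THRESHOLD && touching) { // finger leaves
    touching = false;
    int travel = lastPos - startPos;               // net direction of motion
    if (travel > 0) usbMIDI.sendControlChange(20, 127, 1); // down stroke (placeholder CC)
    else            usbMIDI.sendControlChange(21, 127, 1); // up stroke (placeholder CC)
  }
}
```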

Rick Snow - Turntable Reconstruction

This is a work-in-progress demonstration of a turntable-inspired, semi-autonomous robotic instrument. Six knobs control parameters related to the location, movement, and speed of two contact microphones that are raised and lowered above a rotating acrylic disc.
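
The actuation isn't detailed above, so this is purely illustrative: one knob, low-pass filtered so the arm glides rather than jumps, driving a hobby servo that raises and lowers one contact microphone. The servo, pins, and ranges are all assumptions.

```cpp
// One knob-to-actuator channel of a robotic instrument (hypothetical).
#include <Servo.h>

Servo micLift;
float smoothed = 0.0f;

void setup() {
  micLift.attach(9); // servo signal pin (assumed)
}

void loop() {
  // Exponential smoothing of the knob reading.
  smoothed += 0.05f * (analogRead(A0) - smoothed);
  micLift.write(map((int)smoothed, 0, 1023, 0, 180)); // angle in degrees
  delay(15);
}
```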

Kevin Haywood - Ribbon Control of Audio Synthesis

Ribbon Control of Audio Synthesis from Kevin Haywood on Vimeo.

Ribbon interaction (no audio) from Kevin Haywood on Vimeo.

I’m using an LED light strip to visualize a simple physical model which is running on a Teensy microcontroller, with one ribbon controller, two potentiometers, and two pushbuttons as input. The real-time data from this system is used for simultaneous control of audio synthesis.

The system models two rotating discs, which are depicted as groups of blue and red lights along the single dimension of the LED strip. The blue disc is smaller in diameter and appears as if positioned in front of the red one, occluding it. The ribbon controller is situated adjacent and parallel to the LED strip, and tracks the position and lateral velocity of any single touch. In conjunction with the two pushbuttons, the ribbon can be used to alternately rotate or translate either disc. Each disc has an independently definable coefficient of friction, which can be varied in real time via the potentiometers; a zero-friction state allows any applied motion to cycle indefinitely.
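
A rough sketch of that disc model under stated assumptions: each disc is a block of pixels whose position wraps around the strip as it rotates, with a per-disc friction coefficient damping its velocity, and the smaller blue disc drawn after (in front of) the red one. It uses the Adafruit_NeoPixel library; strip size, pin, colors, and constants are invented, and the ribbon/button input is omitted.

```cpp
// Two frictional discs rendered along a one-dimensional LED strip.
#include <Adafruit_NeoPixel.h>

const int NUM_LEDS = 60;
Adafruit_NeoPixel strip(NUM_LEDS, 6, NEO_GRB + NEO_KHZ800);

struct Disc {
  float angle;    // position along the strip, 0..NUM_LEDS, wraps around
  float velocity; // pixels per second
  float friction; // 0 = cycles indefinitely, larger = stops sooner
  int   width;    // disc diameter in pixels
};

Disc red  = {10.0f, 12.0f, 0.4f, 14};
Disc blue = {30.0f, -8.0f, 0.0f, 8}; // frictionless: keeps spinning

void stepDisc(Disc &d, float dt) {
  d.velocity *= 1.0f - d.friction * dt; // friction as exponential decay
  d.angle = fmodf(d.angle + d.velocity * dt + NUM_LEDS, NUM_LEDS);
}

void drawDisc(const Disc &d, uint32_t color) {
  for (int i = 0; i < d.width; i++)
    strip.setPixelColor(((int)d.angle + i) % NUM_LEDS, color);
}

void setup() { strip.begin(); }

void loop() {
  const float dt = 0.02f;
  stepDisc(red, dt);
  stepDisc(blue, dt);
  strip.clear();
  drawDisc(red,  strip.Color(80, 0, 0));
  drawDisc(blue, strip.Color(0, 0, 80)); // drawn last, occluding the red disc
  strip.show();
  delay(20);
}
```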

Shashaank N - aural360: Audio Spatialization Mobile App

aural360 is a mobile application that can spatialize audio without VR/AR gear or SDKs. It uses the built-in magnetometer and accelerometer sensors in smartphones and tablets to control 360-degree audio panning and 3D position. The application was written in Xamarin, a cross-platform development framework, and the audio was spatialized using the Facebook Audio 360 and AVFoundation SDKs.
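
For the curious, here is the general idea behind this kind of sensor-driven panning, sketched as device-agnostic C++ rather than the app's actual Xamarin/C# code: accelerometer and magnetometer readings are fused into a tilt-compensated compass heading, which then offsets the pan azimuth. This is one common formulation; axis signs and order vary between devices.

```cpp
// Tilt-compensated heading from accelerometer + magnetometer (illustrative).
#include <math.h>

const float TWO_PI_F = 6.28318530718f;

// Returns heading in radians, 0 = magnetic north.
float headingFromSensors(float ax, float ay, float az,
                         float mx, float my, float mz) {
  float roll  = atan2f(ay, az);
  float pitch = atan2f(-ax, ay * sinf(roll) + az * cosf(roll));

  // Rotate the magnetometer vector back into the horizontal plane.
  float hx = mx * cosf(pitch) + my * sinf(pitch) * sinf(roll)
           + mz * sinf(pitch) * cosf(roll);
  float hy = my * cosf(roll) - mz * sinf(roll);

  return atan2f(-hy, hx); // the listener's facing direction
}

// A source fixed in the world is panned by subtracting the listener heading.
float panAzimuth(float sourceAzimuth, float heading) {
  return fmodf(sourceAzimuth - heading + TWO_PI_F, TWO_PI_F);
}
```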

Franz Danksagmüller - wearable MIDI controller (prototype)


https://vimeo.com/444544234/ecd3e1ae52

Work-in-progress demo of a wearable MIDI controller: two flex sensors and one triple-axis accelerometer, sending MIDI to Kyma.

The flex sensor on the finger controls pitch.
The X axis controls filter cutoff.
The Y axis controls octave.
Z acceleration triggers the amp and filter envelopes.
The flex sensor on the elbow controls delay feedback.
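
A compressed sketch of that mapping, with hypothetical pins, CC numbers, and thresholds (Kyma simply receives ordinary MIDI on its end). It assumes an analog accelerometer and flex sensors wired as voltage dividers into a Teensy with usbMIDI enabled.

```cpp
// Wearable controller mapping: flex sensors and accel axes to MIDI.
#include <Arduino.h>

const int FLEX_FINGER = A0, FLEX_ELBOW = A1;
const int ACC_X = A2, ACC_Y = A3, ACC_Z = A4;
const int Z_TRIGGER = 700; // raw level treated as a "hit" (assumption)

bool zArmed = true;

void setup() {}

void loop() {
  // Finger flex -> pitch, sent as a CC for the Kyma sound to interpret.
  usbMIDI.sendControlChange(20, analogRead(FLEX_FINGER) >> 3, 1);

  // X tilt -> filter cutoff, Y tilt -> octave, elbow flex -> delay feedback.
  usbMIDI.sendControlChange(74, analogRead(ACC_X) >> 3, 1);
  usbMIDI.sendControlChange(21, analogRead(ACC_Y) >> 3, 1);
  usbMIDI.sendControlChange(22, analogRead(FLEX_ELBOW) >> 3, 1);

  // A sharp Z acceleration retriggers the amp and filter envelopes.
  int z = analogRead(ACC_Z);
  if (z > Z_TRIGGER && zArmed)  { usbMIDI.sendNoteOn(60, 127, 1); zArmed = false; }
  if (z < Z_TRIGGER - 50)       { usbMIDI.sendNoteOff(60, 0, 1);  zArmed = true; }

  delay(10);
}
```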