
Touchology, an interactive sound table

Touchology is an exploratory investigation into a new wave of music in which instruments and interfaces are designed around gestural controls. Carolyn Tam, Elena Falomo, Joe Muller, and Yang Gao aimed to create a new dream machine that would mesmerise the user, soothing them with molecules of sound and movement.

They regarded randomisable noise as a rising form of music and wanted to turn it into a new form of expression. The white noise machine is controlled by a series of capacitive proximity sensors, allowing it to translate subtle hand movements into digital controls for the antennae.
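As a rough illustration of that translation step, a proximity reading can be mapped onto a drive level for the antennae. The thresholds and the direction of the mapping below are our assumptions, not values from the project:

```cpp
#include <algorithm>

// Hypothetical sketch: map a raw capacitive proximity reading to an
// antenna "retreat" level. The baseline and touch thresholds are
// illustrative only; a real build would calibrate them per electrode.
constexpr int kBaseline = 600;  // assumed reading with no hand nearby
constexpr int kTouch    = 100;  // assumed reading with the hand very close

// Returns 0..255: 0 = hand far away (antennae move freely),
// 255 = hand close (antennae fully retreat).
int proximityToRetreat(int reading) {
    // Clamp so readings outside the calibrated range stay in bounds.
    int clamped = std::clamp(reading, kTouch, kBaseline);
    // Linearly rescale: closer hand (lower reading) -> higher retreat.
    return 255 * (kBaseline - clamped) / (kBaseline - kTouch);
}
```

In a setup like this, the calibration constants matter far more than the maths: capacitive readings drift with humidity and grounding, so a deployed version would re-baseline periodically.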

The random movement of the antennae creates a white noise soundscape, which the user can affect with proximity. When the user taps either the spreader plate or the wire of the antennae, they immediately start to move and bounce.

A practical focus included the design and manufacture of a flexible display structure, the antenna sensors and the robotic operators. A parallel technological focus covered the electronics and coding: tracking hand movement with the capacitive proximity sensors and using it to control the work without physical contact.

The final antennae were cast from resin with the wires embedded. The negative mould was made from silicone to reproduce the wire paths accurately.

The table top was CNC machined out of 18mm birch ply. This gives the surface a very clean, sharp look and allowed easy, accurate placement of the holes and counterbores.

The spreader plate was turned and milled from aluminium. The legs were made from steel box section with clamping sections welded to the top.

Touchology was made with open hardware: it is controlled by a series of Arduinos and a Bare Conductive Touch Board. This allows it to sense both the user's touch and proximity, and react accordingly.

The antennae begin stationary and fully splayed out. When the user touches either the central plate or the wire of an antenna, they immediately start to move and bounce. As the user's hand approaches a moving group, the antennae shrink away, retreating from the table top.

Each Arduino is connected to the central Touch Board via I2C. The Arduinos receive a signal from the Touch Board according to the inputs on the board's capacitive pins and react accordingly.

In terms of software architecture, the Touch Board acts as the master and the Arduinos act as slaves. The Touch Board broadcasts a string of movement instructions to all the slaves via I2C; each slave then locates its own instructions in the string according to its electrodes' index and moves its antennae to match.
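The slave-side lookup can be sketched as below. The wire format here is our assumption (one command character per electrode); the actual Touchology firmware may encode the broadcast differently:

```cpp
#include <string>

// Hypothetical sketch of the slave-side parsing described above.
// Assumed wire format: one character per electrode, e.g. "0210...",
// where each character encodes a movement command for that electrode's
// antenna group.
char commandForElectrode(const std::string& broadcast, int electrodeIndex) {
    // Out-of-range or malformed broadcasts fall back to "stay still",
    // so a dropped I2C byte never produces an undefined movement.
    if (electrodeIndex < 0 ||
        electrodeIndex >= static_cast<int>(broadcast.size())) {
        return '0';
    }
    return broadcast[electrodeIndex];
}
```

Broadcasting one string to every slave and letting each pick out its own slice keeps the master simple: it never needs to address slaves individually, at the cost of every slave receiving the full message.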

The user can tap the groups of antennae to play them, initiating their movement. As the user's hand approaches a moving group, the antennae shrink away, retreating back into the table and falling into silence.
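The behaviour described above amounts to a small per-group state machine. The state names and transition rules below are our reading of the write-up, not the project's own code:

```cpp
// Hypothetical sketch of the per-group behaviour: a tap wakes a group,
// an approaching hand makes it retreat, and with no stimulus it keeps
// bouncing until a hand silences it.
enum class GroupState { Idle, Bouncing, Retreating };

GroupState nextState(GroupState current, bool tapped, bool handNear) {
    if (handNear) return GroupState::Retreating;  // proximity overrides all
    if (tapped)   return GroupState::Bouncing;    // a tap wakes the group
    if (current == GroupState::Retreating)
        return GroupState::Idle;                  // hand gone: fall silent
    return current;                               // otherwise persist
}
```

Modelling each group independently is what lets the user "play" the table: synchronised taps produce harmony, staggered taps produce rhythm, and the proximity override mutes groups selectively.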

This way, users can create their own soundscape to taste, modulating the intensity of the instrument, creating new rhythms, and producing harmony by synchronising the modules or introducing more randomness.

If you’d like to see your project on our blog, email us here: info@bareconductive.com

Don’t forget to share your photos with us on Twitter and Instagram using #BareConductive!