
Painting Sound

‘Painting Sound’ sought to develop more intuitive and immediate ways of interacting with sound design techniques, expanding the capabilities of modern music production and freeing us from the human–computer interaction paradigms of traditional Western music.

This was my individual final project as part of my MSc. The project was carried out during the peak of the COVID-19 pandemic, so limited resources and support were available. An app capable of capturing touchscreen data and sending it over the OSC protocol was initially developed in Unity before being replaced by an Android app made by a fellow student.

Idea

The initial idea stemmed from the desire to mimic the ease of accessibility and on-the-move creativity provided by the likes of an artist’s sketch pad or writer’s notepad. The intention was to focus on using technology commonly found in our pockets such as smartphones as well as exploring the design of new virtual instruments for more gesturally rich methods of sonic expression.

Method

The visual programming environment Max/MSP was used to rapidly prototype a set of sound design tools driven by OSC data sent from a smartphone. The main smartphone inputs were touch position, accelerometer data and an approximation of pressure derived from touch radius. These tools were then adapted to take data from Ultraleap’s Leap Motion gesture tracking device, and the results of the two methods were compared. The Leap Motion was chosen for the richness of its extractable data and its ability to interpret hand movements easily over a relatively wide area.
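The project's actual code isn't shown here, but the smartphone-to-Max link described above can be sketched with nothing but the Python standard library. This is a minimal, hedged illustration of the OSC 1.0 wire format (padded address string, type tag string, big-endian float32 arguments) sent over UDP; the address "/touch/xy", the port 7400 and the touch-radius bounds are hypothetical, not values from the project.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)  # big-endian float32
    return msg

def pressure_from_radius(radius: float, r_min: float = 2.0, r_max: float = 12.0) -> float:
    """Crude pressure proxy: linearly map touch radius to 0..1 and clamp.
    r_min/r_max are hypothetical device-dependent bounds."""
    return max(0.0, min(1.0, (radius - r_min) / (r_max - r_min)))

# Send a normalised touch position to a Max patch listening on
# [udpreceive 7400] (port is an assumption).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = osc_message("/touch/xy", 0.42, 0.87)
sock.sendto(packet, ("127.0.0.1", 7400))
```

In Max, the same data would typically be unpacked with [udpreceive] into [route /touch/xy]; the sketch above only shows the sending side.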

Takeaways

Feedback from testing remarked on the ease of exploration that came from using the Leap Motion and the natural tendency to enter a state of 'flow'. The limitations of available phone technology made gestures difficult to interpret in creatively meaningful or instinctive ways, whereas the more comprehensive gestural tracking gave much better results and far more intuitive, performative control. The second method had its own limitations, however, most notably the lack of physical boundaries and haptic feedback.

Further development could focus on using more recent phone technology such as LiDAR data or other gesture tracking information as well as exploring visualisation techniques for providing feedback to the user. Machine learning models could also be built to recognise hand gestures, shapes and positions for use in producing signals and exploring the idea of mimicking a traditional conductor.

Demos

Final Piece - "Escaping Paradigms"

To demonstrate the project I composed a short piece. The tools were used to create sound design and musical elements, which were arranged in a DAW. Effect parameter changes were then performed live using the Max for Live tool shown in the previous video.
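The tool's actual parameter mappings aren't documented here, but performing effect changes from gesture data usually means mapping a value normalised to 0–1 (for example, hand height from the Leap Motion) onto a parameter range. For frequency-type parameters such as a filter cutoff, an exponential mapping is the common choice so that equal hand movements sound like equal pitch steps. The range below is a hypothetical illustration, not the project's setting.

```python
def map_cutoff(norm: float, low_hz: float = 80.0, high_hz: float = 8000.0) -> float:
    """Map a 0..1 gesture value to a filter cutoff on an exponential
    (perceptually even) scale. Range endpoints are assumed values."""
    norm = max(0.0, min(1.0, norm))  # clamp out-of-range gesture data
    return low_hz * (high_hz / low_hz) ** norm

# A hand at mid-height (0.5) lands at the geometric mean of the range,
# i.e. 800 Hz for the 80–8000 Hz range above.
```

A linear mapping of the same value would bunch all the audible movement into the top of the gesture range, which is one reason exponential scaling is the default for frequency controls.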


The piece is inspired by the idea of escaping paradigms, moving away from traditional methods of composing and performing music in favour of new forms of sonic exploration.
