Abstract
In this paper, a method is outlined for the sonification of
experimentally observed Brownian motion organized into
optical structures. Sounds were modeled after the tracked,
three-dimensional motion of Brownian microspheres confined in the potential wells of a standing-wave laser trap.
Stochastic compositions based on freely diffusing Brownian particles are limited by the indeterminacy of the data
range and by constraints on the size and dimensionality of the data.
In this study, these limitations are overcome by using an
optical trap to restrict the random motion to an ordered
stack of two-dimensional regions of interest. It is argued
that the confinement of the particles in the optical lattice
provides an artistically appealing geometric landscape for
constructing digital audio effects and musical compositions based on experimental Brownian motion. A discussion of future work on data mapping and computational
modeling is included. The present study finds relevance in
the fields of stochastic music and sound design.
Note about Data Mapping
I sonified the position data of single, optically trapped Brownian microspheres
using Pure Data and Ableton Live. Each radial displacement of the sphere was mapped to an audible sound.
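As a point of reference, the sketch below shows one way a radial displacement series could be derived from the tracked bead positions. This is a minimal Python illustration; the array layout and the use of the mean position as the trap center are assumptions, not details of the actual analysis pipeline.

import numpy as np

def radial_displacements(xy, center=None):
    """Radial displacement of each tracked (x, y) position from the
    trap center, estimated here as the mean position if not supplied."""
    xy = np.asarray(xy, dtype=float)
    if center is None:
        center = xy.mean(axis=0)  # assumption: bead diffuses about the well center
    return np.hypot(xy[:, 0] - center[0], xy[:, 1] - center[1])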
I used four different mapping schemes: equal-area, biased, probability, and timbre mapping. In each scheme, the
centermost subregion of the mapping region corresponded to a single pure tone. In equal-area mapping, the
note increased along a particular musical scale as the particle moved radially outward into successive equal-area
subregions. In biased mapping, the area of the centermost subregion was enlarged relative to the other subregions. In
probability mapping, each time the particle entered a new subregion, the note had an equal (50%) probability of
increasing by one fixed integer number of MIDI note numbers or decreasing by another. In timbre mapping, the
spectral flux of a triangle wave was controlled by the stochastic motion of the trapped particle.
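To make the four schemes concrete, the following Python sketch implements equal-area and biased radial binning, the probability walk over MIDI note numbers, and a triangle-wave spectrum whose brightness tracks the particle. All numeric choices here (bin count, center-area fraction, scale, step sizes, and the brightness control law) are hypothetical placeholders rather than the parameters used in the study.

import numpy as np

def equal_area_edges(r_max, n_bins):
    """Outer radii of n_bins concentric subregions of equal area:
    the k-th boundary is r_max * sqrt(k / n_bins)."""
    return r_max * np.sqrt(np.arange(1, n_bins + 1) / n_bins)

def biased_edges(r_max, n_bins, center_frac=0.4):
    """Biased variant: give the centermost subregion `center_frac` of the
    total area and split the remainder equally among the outer rings."""
    areas = np.full(n_bins, (1.0 - center_frac) / (n_bins - 1))
    areas[0] = center_frac
    return r_max * np.sqrt(np.cumsum(areas))

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # scale steps in semitones (illustrative)

def equal_area_note(r, edges, root=60):
    """Equal-area mapping: step up the scale as the particle moves outward;
    the centermost subregion (bin 0) sounds the root tone."""
    k = min(int(np.searchsorted(edges, r)), len(edges) - 1)
    return root + C_MAJOR[k % len(C_MAJOR)] + 12 * (k // len(C_MAJOR))

def probability_walk(bins, start=60, up=2, down=1, rng=None):
    """Probability mapping: on each entry into a new subregion, raise the
    note by `up` or lower it by `down` MIDI numbers with equal probability."""
    if rng is None:
        rng = np.random.default_rng()
    note, notes = start, [start]
    for prev, cur in zip(bins, bins[1:]):
        if cur != prev:  # particle crossed into a new subregion
            note += up if rng.random() < 0.5 else -down
        notes.append(note)
    return notes

def triangle_partials(r, r_max, n_partials=9):
    """Timbre mapping sketch: a triangle wave has odd harmonics with 1/n^2
    rolloff; letting the exponent fall toward 1 as the bead moves outward
    raises the spectral flux (brightness) of the tone."""
    p = 2.0 - r / r_max                          # exponent: 2 at center -> 1 at edge
    n = np.arange(1, 2 * n_partials, 2)          # odd harmonic numbers
    return ((-1.0) ** ((n - 1) // 2)) / n ** p   # signed partial amplitudes

In a real-time patch, the resulting note numbers would be sent to a synthesizer as MIDI events; they are computed offline here only for clarity.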
To visualize the equal-area mapping, I made a video
that synchronizes the sounds to the experimental motion (see link below).
Acknowledgements
The author would like to thank Keith Bonin for guiding
the experimental work that inspired this study; Research
Corporation and Wake Forest University for funding the
experiments; and David Busath, Justin Peatross, Steve
Ricks, Christian Asplund, Rodrigo Cadiz, Kurt Werner,
Nick Sibicky, Adam Brooks, and Katherine McKell for
discussing ideas and offering feedback about the project.
Resources
Citation
McKell, Chad. "Sonification of Optically-Ordered Brownian Motion." Proceedings of the International Computer Music Conference, Utrecht, Netherlands, September 2016.