Brown CS News

Named UTRA Supports CS-Music Collaboration

By Jocelyn Adams and Ian Sherman

In the spring of 2007, a team composed of Jocelyn Adams, a composer/musician and music concentrator, and Ian Sherman, a musician and computer science concentrator, received a named UTRA earmarked for research in media and production. They pursued their project, "Wearable sensor networks as an interface to interactive media," under the guidance of Professor Ugur Cetintemel.

As aspiring musicians and scientists, Ian and Jocelyn were drawn to this project as a chance to bridge their interests in art and technology. Interactive media is an active research area at Brown in the electronic music and modern culture and media departments, as well as in the RISD digital media program. Ian and Jocelyn began their work with visits to these departments, as well as a field trip to the MIT Media Lab. Since they were newcomers to this world, their initial goal was to learn its landscape.

Ian and Jocelyn first set their sights on developing software that could act as an abstraction over wireless hardware for artists. They quickly found, however, that a good solution to this problem already exists in the programming environment Max/MSP/Jitter. Max is a graphical data-flow language, which, when combined with packages to support audio synthesis (MSP) and video processing (Jitter), offers the artist a powerful tool for manipulating data streams and mapping them to audiovisual output. Its active user community maintains objects for interfacing with sensors of all shapes and sizes. Following the advice of Professor Chad Jenkins, who has had success using Nintendo Wiimotes to control Sony AIBO robots, they focused their attention on these familiar and inexpensive wireless controllers.
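Max patches themselves are built graphically, so there is no project code to quote here, but the glue between a sensor and a Max patch is easy to sketch in text. The following minimal Python example is an illustration rather than the team's actual setup: the python-osc package, the port number, and the read_accelerometer stub are all assumptions made for the sketch. On the Max side, an object such as [udpreceive 9000] could pick up the same stream and hand it to the rest of the patch.

    # Minimal sketch (not the project's actual code): forward accelerometer
    # readings to a Max patch over OSC. Assumes the python-osc package and a
    # Max patch listening on port 9000; read_accelerometer() is a
    # hypothetical stand-in for a real Wiimote driver.
    import time
    import random

    from pythonosc.udp_client import SimpleUDPClient


    def read_accelerometer():
        """Hypothetical sensor read; here we simulate an (x, y, z) triple."""
        return (random.uniform(-1, 1),
                random.uniform(-1, 1),
                random.uniform(-1, 1))


    def main():
        client = SimpleUDPClient("127.0.0.1", 9000)  # Max: [udpreceive 9000]
        while True:
            x, y, z = read_accelerometer()
            # One OSC message per reading; Max can route it by its address.
            client.send_message("/wiimote/accel", [x, y, z])
            time.sleep(0.01)  # ~100 Hz, roughly the rate a Wiimote reports at


    if __name__ == "__main__":
        main()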

After growing comfortable with Max and successfully interfacing with the sensors, Ian and Jocelyn set to work implementing some algorithms to interpret the data streams coming in from the accelerometers on the Wiimotes. For example, they hoped to identify, in real time, when two Wiimotes were moving in sync. They found that some previously developed variants of the dynamic time-warping algorithm worked quite well. Though they didn’t have time to realize an artistic piece on any large scale, they were able to create some proof-of-concept pieces. In one piece, two users navigate around an aural landscape with their Wiimotes. When their movements are synchronized, they can move more quickly through this 3D environment populated with sounds.
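For readers curious what synchrony detection looks like in code, here is a minimal Python sketch of classic dynamic time warping applied to two windows of accelerometer data. It illustrates the general technique rather than the team's implementation: the window contents, the threshold, and the in_sync helper are assumptions made for the example, and a real-time version would run over sliding windows of the live streams.

    # Minimal sketch of dynamic time warping (DTW) for judging whether two
    # accelerometer streams are moving "in sync". An illustration of the
    # general technique, not the project's code; the threshold and window
    # length are arbitrary assumptions.
    import numpy as np


    def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
        """Classic O(len(a) * len(b)) DTW over 1-D signals."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                # Best of the diagonal, vertical, and horizontal predecessors.
                cost[i, j] = d + min(cost[i - 1, j - 1],
                                     cost[i - 1, j],
                                     cost[i, j - 1])
        return cost[n, m]


    def in_sync(window_a: np.ndarray, window_b: np.ndarray,
                threshold: float = 5.0) -> bool:
        """Compare acceleration windows from two controllers."""
        return dtw_distance(window_a, window_b) < threshold


    if __name__ == "__main__":
        t = np.linspace(0, 2 * np.pi, 50)
        a = np.sin(t)        # controller 1
        b = np.sin(t + 0.3)  # controller 2, slightly lagged
        # DTW absorbs small phase offsets, so this still counts as "in sync".
        print(in_sync(a, b))

Because DTW warps the time axis before comparing samples, two gestures register as synchronized even when one player lags slightly behind the other, which is exactly what makes it attractive for live performance.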

Since the summer, Jocelyn has incorporated this work into performances with her band The Low Anthem, and Ian has been building more interactive media in a class in RISD's digital media department. They are grateful to Ugur, Chad, the computer science department, the UTRA program, and the NSF for their support of this project.
