Art Code Music

Virtual Reality Music Visualization


Under the name “Echobit,” Brian Hansen and I have been performing immersive VJ sets and audio-visual experiments using the Oculus Rift. We apply audio feature extraction and MIR techniques to create rich, interactive visuals. Users can explore visual worlds as they react to musical material in real time. The visuals are also projected on the wall so that all audience members can take part in the experience.

I believe it’s the first application of virtual reality technology to VJing, and to music visualization in general.

It is built in openFrameworks using a custom system for generating audio-reactive geometry and GLSL shaders.





Art Code

Interactive Installation for SB Museum


I created an interactive touchscreen application for the Santa Barbara Museum of Art that enabled museum patrons to construct their own camera-less images (photograms) in the style of Laszlo Moholy-Nagy, then upload the results to social media. There were nearly a thousand finished submissions, which can be viewed in this Flickr album. Here are a few of the results:


It was very interesting trying to strike a balance between a simple, intuitive interface and enough depth for more subtle artistic expression (i.e. “low-floor, high-ceiling”). Judging by how I observed people using it, it seems to have been relatively effective in this regard: some users would create a photogram from start to finish in only 20 seconds, while others would refine the placement of their objects for 15 minutes or more.



Art Code

Photogram Simulation for SBMA

Here are a few screenshots of an interactive app I’m installing for an upcoming exhibition at the Santa Barbara Museum of Art. The show is called The Paintings of Moholy-Nagy: The Shape of Things to Come. I’m trying to create a realistic simulation of a “photogram,” or an image made by placing objects directly over light-sensitive paper. Users place and rotate the various objects on a touch screen before “exposing” them to create the finished photogram. The results are starting to look more convincing, although there’s still some work left on layering and the simulated perspective shifts. Everything is done in Canvas/JavaScript.



Art Code Voice of Sisyphus

VOS going to Shanghai


Voice of Sisyphus will be shown at the Chronus Art Center in Shanghai starting in June. I developed the compositional software that controls the piece.


Delacroix Exhibition iPad App

I built an iPad app for the Santa Barbara Museum of Art. The exhibition, entitled “Delacroix and the Matter of Finish,” ran from October through January 2014. Here’s a link to the original exhibition.




Standing Waves: a Multimodal Composition in the AlloSphere

This was my thesis project. You can find my presentation slides here.

Standing Waves is an audio-visual installation designed for the AlloSphere, an immersive multimedia instrument being built at UCSB. The piece presents an interactive visualization of two-dimensional wave propagation projected in three dimensions around the surface of a sphere. This simulation is then sonified through a variation of additive synthesis and spectral decomposition, and the resulting audio is spatialized around the perimeter of the performance space.

Users can interact with and control the combined audio-visual synthesizer through a motion-capture interface and gestural mapping system. The piece’s form is structured as a series of modules that run under the guidance of user input, or in a semi-autonomous “installation mode” when little or no user interaction is detected.
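For the curious, the core of a two-dimensional wave simulation like the one described above can be sketched with a leapfrog finite-difference update. This is a minimal standalone illustration with my own class and method names, not the actual AlloSphere code; the real piece projects the field onto a sphere and sonifies it via additive synthesis and spectral decomposition, both of which are elided here in favor of a single sample "tap" on the grid.

```cpp
#include <cassert>
#include <cmath>
#include <utility>
#include <vector>

// Minimal sketch of the 2D wave equation on an N x N grid,
// using the standard leapfrog update with fixed (zero) boundaries.
class WaveGrid {
public:
    // courant = c * dt / dx; must stay below 1/sqrt(2) for 2D stability
    WaveGrid(int n, double courant = 0.4)
        : n_(n), c2_(courant * courant),
          prev_(n * n, 0.0), curr_(n * n, 0.0), next_(n * n, 0.0) {}

    // Inject a displacement (e.g. from a user gesture) at a grid point.
    void excite(int x, int y, double amp) { at(curr_, x, y) += amp; }

    // u_next = 2u - u_prev + (c*dt/dx)^2 * laplacian(u)
    void step() {
        for (int y = 1; y < n_ - 1; ++y)
            for (int x = 1; x < n_ - 1; ++x) {
                double lap = at(curr_, x - 1, y) + at(curr_, x + 1, y)
                           + at(curr_, x, y - 1) + at(curr_, x, y + 1)
                           - 4.0 * at(curr_, x, y);
                at(next_, x, y) = 2.0 * at(curr_, x, y)
                                - at(prev_, x, y) + c2_ * lap;
            }
        std::swap(prev_, curr_);
        std::swap(curr_, next_);
    }

    // Crude sonification tap: read the field at a fixed point each step
    // and treat the resulting sequence as an audio signal.
    double sample(int x, int y) { return at(curr_, x, y); }

private:
    double& at(std::vector<double>& g, int x, int y) { return g[y * n_ + x]; }
    int n_;
    double c2_;
    std::vector<double> prev_, curr_, next_;
};
```

Exciting a point and stepping the grid makes the disturbance spread outward to neighboring cells, which is the behavior being visualized and sonified in the piece.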

Art Code

Harold Cohen’s Coloring Algorithm in C++

Harold Cohen’s coloring formula is a heuristic method for probabilistically controlling randomly generated color saturation and lightness values. It has been used extensively by his AARON algorithmic painter, and is summarized in detail in his 2006 essay “Color, Simply.” I ported the algorithm to C++ as a class, which you can download here (it currently relies on AlloCore; a standalone version is coming soon).
The algorithm can be described as follows:

  1. Three normalized number ranges are chosen, corresponding to low, medium, and high values. For example: 0.15-0.35 (L), 0.4-0.65 (M), 0.8-1.0 (H).
  2. These are set up in a 3×3 matrix, each cell corresponding to a possible saturation-lightness pairing. For example, a low-low (LL) pairing provides both saturation and lightness values chosen randomly from within the low range.
  3. During initialization, a probability value is assigned to each of the 9 pairing possibilities in this matrix. Cohen suggests using only 2-3 of them per composition, for example: 20% LL, 40% HL, and 40% MH.
  4. When a new color is desired, one of these range pairs is selected based on its assigned probability, and then a specific saturation-lightness pair is chosen randomly from within each of the selected pair’s ranges.
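The steps above can be sketched in plain C++ as follows. This is a standalone illustration with my own names and structure, not the AlloCore-based class from the post; the ranges and weights in the usage example are the ones Cohen suggests above.

```cpp
#include <array>
#include <random>
#include <utility>
#include <vector>

// One normalized value range, e.g. {0.15, 0.35} for "low"
struct Range { double lo, hi; };

// Standalone sketch of Cohen's saturation/lightness heuristic.
class CohenColor {
public:
    // ranges[0] = low, ranges[1] = medium, ranges[2] = high
    CohenColor(std::array<Range, 3> ranges, unsigned seed = 0)
        : ranges_(ranges), rng_(seed) {}

    // Give a (saturation, lightness) range pairing a probability weight,
    // e.g. addPairing(0, 0, 0.2) gives the LL pairing 20% weight.
    void addPairing(int satRange, int lightRange, double weight) {
        pairings_.push_back({satRange, lightRange, weight});
    }

    // Draw one saturation/lightness pair: first pick a pairing according
    // to its weight, then sample uniformly within each of its two ranges.
    std::pair<double, double> next() {
        std::vector<double> weights;
        for (const auto& p : pairings_) weights.push_back(p.weight);
        std::discrete_distribution<int> pick(weights.begin(), weights.end());
        const auto& p = pairings_[pick(rng_)];
        return { uniform(ranges_[p.sat]), uniform(ranges_[p.light]) };
    }

private:
    struct Pairing { int sat, light; double weight; };
    double uniform(Range r) {
        return std::uniform_real_distribution<double>(r.lo, r.hi)(rng_);
    }
    std::array<Range, 3> ranges_;
    std::vector<Pairing> pairings_;
    std::mt19937 rng_;
};
```

Using Cohen's suggested setup, every color drawn lands in one of the three active pairings: LL, HL, or MH.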