Category Archives: Art

Art Code Interview Music

Talk @ SPARKS | New Media Architectures: Vancouver

Art Code Music

Heron’s Soundscape

Exhibited at SIGGRAPH 2025 | New Media Architectures: Vancouver

Heron’s Soundscape is a generative musical composition designed as an interactive augmented reality (AR) experience, to be played by participants using mobile devices and the “Heron’s Dreamscape” mural as a visual score. Sound and visual accompaniment are created in real time through participants’ direct interactions with the mural.

Audience members create one or more “performances” of the piece, determined by their choices and interactions: the angles, locations, and paths they trace with their mobile devices, while hearing the results of each engagement in relation to the images shown in the device’s viewfinder.

Check it out if you’re in Downtown Vancouver:
https://amusesmile.github.io/HeronsSoundscape

A revisiting of the New Media Architectures: Vancouver exhibition at SIGGRAPH 2025, which featured augmented reality (AR) works by eight artists in dialogue with “Heron’s Dreamscape”, a vibrant public mural by artist Priscilla Yu. Participating artists included: Jiwon Ham & Ana María Cárdenas, Joshua Dickinson, Sahar Sajadieh & Manaswi Mishra, Mike Rader, Darya Ramezani & Gene Anthony Santiago-Holt, and Priscilla Yu. Curated by: Miriam Esquitín, Johannes DeYoung, and Gustavo Alfonso Rincon.

The exhibition reimagined the role of public art through digital augmentation, activating the mural as a living interface between place, community, and technology. Our discussion will delve into the collaborative process behind the exhibition and the evolving relationship between physical murals and digital interventions. Together, we’ll explore how site-specific digital media can expand the narrative capacity of public artworks, deepen community engagement, and reframe our experience of urban environments. Through an interdisciplinary lens, the conversation will address the potentials and challenges of blending artistic traditions with emerging technologies, and what it means to co-author public space in the digital age. Audience members will gain insight into the artistic, curatorial, and technical approaches and visions that shaped the exhibition, while reflecting on the broader cultural impact of art in augmented urban landscapes.

This SPARKS event will feature artists Joshua Dickinson, Jiwon Ham, Ana María Cárdenas, Mike Rader, and Sahar Sajadieh in dialogue with curators Miriam Esquitín, Gustavo Rincon, and Johannes DeYoung.

Acknowledgements: This exhibition is organized in partnership with ACM SIGGRAPH, the ACM SIGGRAPH DAC standing committee, Bentall Centre, and Downtown Van.

Art Code

Selected Algorithmic Drawing Experiments

For many years I’ve tinkered with various algorithms that take a color photograph and convert it into a line drawing. Unlike “style transfer” or pencil-effect filters, these drawings are actually made up of individual strokes and can therefore be drawn with a pen plotter or similar device. Some of the more advanced versions even contain simple logic for hatching and cross-hatching to produce shading, in a technique similar to that of a human artist. Thus far I’ve only used heuristic computer vision approaches, but I hope to one day make use of a bit of machine learning to improve the results aesthetically.
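The hatching idea can be sketched roughly like this (a hypothetical simplification in plain JavaScript, not the actual algorithm): map each image tile’s brightness to a hatch-line density, with darker tiles getting more closely spaced strokes, and the darkest tiles getting a second, perpendicular cross-hatching pass.

```javascript
// Sketch: convert a grayscale tile's brightness into pen-plotter strokes.
// Darker tiles get denser hatch lines; the darkest also get a crossing pass.
// (Illustrative simplification only, not the code behind these drawings.)
function hatchStrokes(brightness, tileSize = 32) {
  const darkness = 1 - brightness / 255;          // 0 = white, 1 = black
  const strokes = [];
  if (darkness < 0.1) return strokes;             // leave near-white tiles empty
  // Line spacing shrinks as the tile gets darker (minimum 2 px apart).
  const spacing = Math.max(2, Math.round(tileSize * (1 - darkness)));
  // Primary pass: horizontal hatch lines.
  for (let y = 0; y < tileSize; y += spacing) {
    strokes.push({ x1: 0, y1: y, x2: tileSize, y2: y });
  }
  // Cross-hatch pass (vertical) only for the darkest tiles.
  if (darkness > 0.6) {
    for (let x = 0; x < tileSize; x += spacing) {
      strokes.push({ x1: x, y1: 0, x2: x, y2: tileSize });
    }
  }
  return strokes;
}
```

Because each stroke is an explicit line segment rather than a raster effect, the output translates directly into plotter pen movements.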


Art Interview

Interview with Ted’s Little Dream

For our ongoing series of interviews through Static Attraction, we talk art and design with Ted Chin, better known by his handle “Ted’s Little Dream.” He was chosen as the 2021 Photoshop splash screen cover artist so it was extremely fun hearing his story about how that happened, as well as his thoughts about the creative community, copyright, and NFTs.

Art Code Music

Virtual Reality Music Visualization


Under the name “Echobit,” Brian Hansen and I have been performing immersive VJ sets and audio-visual experiments using the Oculus Rift. We apply audio feature extraction and MIR techniques to create rich, interactive visuals. Users can explore visual worlds that react to the musical material in real time. The visuals are also projected on the wall so that all audience members can take part in the experience.

I believe it’s the first application of virtual reality technology to VJing, and to music visualization in general.

It is built in openFrameworks, using a custom system for generating audio-reactive geometry along with GLSL shaders.
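The core audio-reactive mapping can be sketched like this (a hypothetical JavaScript illustration; the real system is openFrameworks/C++ with GLSL): extract a feature such as per-frame RMS energy, smooth it so the geometry doesn’t jitter, and map the smoothed level onto a visual parameter like scale.

```javascript
// Sketch: map an audio frame's RMS energy to a geometry scale parameter,
// with exponential smoothing so visuals react without flickering.
// (Illustrative only -- the actual system is openFrameworks/C++ with GLSL.)
function rms(samples) {
  const sumSq = samples.reduce((acc, s) => acc + s * s, 0);
  return Math.sqrt(sumSq / samples.length);
}

function makeReactiveScale(smoothing = 0.8, minScale = 1, maxScale = 3) {
  let level = 0; // smoothed energy, persists across frames
  return function update(samples) {
    level = smoothing * level + (1 - smoothing) * rms(samples); // low-pass
    return minScale + (maxScale - minScale) * Math.min(1, level);
  };
}
```

The same pattern generalizes to other MIR features (spectral centroid, onsets) driving other parameters (color, rotation, shader uniforms).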


Art Code

Interactive Installation for SB Museum


I created an interactive touchscreen application for the Santa Barbara Museum of Art that enabled museum patrons to construct their own camera-less images (photograms) in the style of Laszlo Moholy-Nagy, then upload the results to social media. There were nearly a thousand finished submissions, which can be viewed in this Flickr album. Here are a few of the results:


It was very interesting trying to strike a balance between a simple, intuitive interface and enough depth for more subtle artistic expression (i.e. “low floor, high ceiling”). Based on how I observed people using it, it seems to have been relatively effective in this regard: some users would create a photogram from start to finish in only 20 seconds, while others would refine the placement of their objects for 15 minutes or more.


Art Code

Photogram Simulation for SBMA

Here are a few screenshots of an interactive app I’m installing for an upcoming exhibition at the Santa Barbara Museum of Art. The show is called The Paintings of Moholy-Nagy: The Shape of Things to Come. I’m trying to create a realistic simulation of a “photogram,” an image made by placing objects directly on light-sensitive paper. Users will use a touchscreen to place and rotate the various objects before “exposing” them to create the finished photogram. The results are starting to look more convincing, although there’s still some work to do on the layering and the simulated perspective shifts. Everything is done in Canvas/Javascript.
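A minimal sketch of the underlying exposure model (my own illustrative version in plain JavaScript, not the app’s actual Canvas code): light passing through a stack of objects multiplies their transmittances, and the paper darkens in proportion to the light that reaches it, so overlapping translucent objects leave lighter, layered silhouettes.

```javascript
// Sketch: simulate photogram exposure at one point on the paper.
// Each object covering the point has an opacity in [0, 1]; light passing
// through the stack multiplies the objects' transmittances (1 - opacity).
// Fully exposed paper turns black (0); fully blocked paper stays white (255).
// (Illustrative model only, not the exhibit's actual implementation.)
function exposedValue(coveringOpacities) {
  const transmittance = coveringOpacities.reduce(
    (t, opacity) => t * (1 - opacity),
    1
  );
  // More light reaching the paper means a darker final pixel value.
  return Math.round(255 * (1 - transmittance));
}
```

Running this per pixel over the canvas gives the multiplicative layering effect where two half-opaque objects overlap to block three quarters of the light.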
