Cell Interactive

AgRP Improv: An Interactive Cover Project for Cell Magazine

A collaboration between an artist, a composer, and a neuroscientist created visual and sonic realizations from ultrasonic sound files for an interactive cover of Cell magazine, based on the paper Functional Ontogeny of Hypothalamic Agrp Neurons in Neonatal Mouse Behaviors (First Author: Marcelo R. Zimmer; Corresponding Author: Marcelo O. Dietrich).

  

Three Cover Proposals: Cover A, Cover B, and Cover C.

The project was born out of a series of conversations between Marcelo, Matthew, and Dana. How could resolution be brought to something outside of human hearing? What would it look like? These were some of the questions addressed in the series of sonic and visual experiments explored by the artists.

Part I - Sound Improv

The sound composition, AgRP Improv, is a single-take recording, performed much like a DJ set, of multiple analog synthesizers controlled only by frequency data derived from neonate vocalizations. From the ultrasonic recordings, the Dietrich Lab created a scatter plot from which we were able to isolate the discrete frequencies in the vocalizations. Through simple division by a factor of 400, we brought the frequencies down into the range of human hearing. We then created a simple program in the visual programming environment Max/MSP to convert the frequency data from Hz to volts. With that data output as a 1V/Oct signal through a DC-coupled audio interface, I was able to control multiple analog synthesizers. The gestural shapes of the vocalizations could now drive voltage-controlled oscillators (VCOs), voltage-controlled filters (VCFs), and voltage-controlled amplifiers and mixers (VCAs), as well as trigger envelope generators (ADSRs) and even tempo gates for drum synthesizers. The result, as can be seen in the performance video, may look like a bird’s nest of cables, but it is an intuitive, creative performance environment for endless variation, all driven by frequency data from crying baby mice.
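
Below is a rough sketch of that Hz-to-volts step, written in Python rather than the actual Max/MSP patch; the division by 400 comes from the text above, while the 0 V reference frequency and all names here are assumptions:

```python
import math

ULTRASONIC_SCALE = 400   # division factor used to bring the calls into human hearing range
REF_FREQ_HZ = 16.35      # assumed 0 V reference (roughly C0); the real patch may use another

def hz_to_volts(ultrasonic_hz: float) -> float:
    """Scale an ultrasonic frequency into the audible range, then express it
    as a 1V/Oct control voltage (one volt per octave above the reference)."""
    audible_hz = ultrasonic_hz / ULTRASONIC_SCALE
    return math.log2(audible_hz / REF_FREQ_HZ)

# Example: a 65 kHz neonate call scales to 162.5 Hz, about 3.3 V above the reference.
for call_hz in (40_000, 65_000, 90_000):
    print(f"{call_hz} Hz -> {hz_to_volts(call_hz):.2f} V")
```

Sent out through a DC-coupled interface, a voltage stream like this is what lets the pitch contour of each cry play the oscillators and filters directly.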

– Matthew Suttor

 
 

Part II - Visual Improv

The three visualizations question the invisible sonic spaces of the baby mice’s cries. Building from Matthew’s human-hearing-scaled Dietrich Lab data, I was able to further scale the frequencies of the individual screams into three-dimensional curves. Using a MEL script in Maya, the processed curve frequencies were mapped onto XYZ points (X as the Hz data, Y as linear time for animation purposes at 24 FPS, and Z as the distance between each scream), as sketched below. The curves provided the base landscape for each of the three cover visualizations. A series of experiments was generated, such as lofting the space between each scream into a surface, generating particles to chase the curve, and lighting geometry animated along the curve path. All of these tests resulted in a biological pattern or movement that was explicitly related to the numbers found in the data, but represented as a distilled geometric visual system over time.

As I watched the spatial dimensions of the cries merge into a dimensional form, cameras, lighting, and particles were used to investigate the parameters of what constitutes a sonic landscape. My approach was that of a landscape photographer, at first setting the camera in Maya to around 20mm, as demonstrated in Covers B and C. For Cover A, I went with an 80mm focal length to work toward a more even hierarchy in the curves, with emphasis on the depth created by the lighting and particles, and with consideration of how light and atmosphere would impact the space over time.
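
As a loose sketch of that frequency-to-geometry mapping, plain Python stands in here for the MEL script used in Maya; only the X/Y/Z assignments and the 24 FPS rate come from the text, and the data layout and names are assumptions:

```python
FPS = 24.0  # animation frame rate from the text

def scream_to_point(freq_hz, frame_index, gap_to_next):
    """Map one scream to a 3D point: X is the Hz data, Y is linear time
    at 24 FPS, and Z is the distance to the next scream."""
    x = freq_hz
    y = frame_index / FPS
    z = gap_to_next
    return (x, y, z)

def build_curve_points(screams):
    """screams: ordered list of (freq_hz, gap_to_next_scream) pairs.
    Returns XYZ points that could drive a curve inside Maya,
    e.g. maya.cmds.curve(p=points) in a Maya session."""
    return [scream_to_point(freq, i, gap) for i, (freq, gap) in enumerate(screams)]

# Hypothetical values: (scaled frequency in Hz, spacing to the next scream)
points = build_curve_points([(162.5, 0.4), (148.0, 0.7), (175.2, 0.3)])
```

Curves built from points like these are what get lofted, chased by particles, and lit in the cover experiments described above.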

The graphic methods used to build these images were further extended and refined in animation. Taking Matthew’s composition files and performance with the synthesizer, I worked with the graphics to create a short film, AgRP Improv Visual Sixteen, expressing the world of communication between a baby mouse and its mother.

– Dana Karwas

Animation process explorations.

Image process tests. Curves, Lofted Curves, Architecture, Spectrum Lighting, Particles, X-Rays, Sonic Fields, and Curve Landscape scenes. The images below were created as renderings in Maya and further processed in Photoshop.

