Sponsored Research

The HP + Yale Curricular Technology Project examines the role of technology in assisting learning and teaching. The collaboration is the culmination of an ongoing relationship between HP and Yale that is now in its sixth year of sponsored research.

The HP + Yale Curricular Technology Project has supported the continued development of programs and research at the Center for Collaborative Arts and Media (CCAM). This includes CCAM’s Curriculum in Residence program, which demonstrates how technology can improve, advance, or highlight curricula to the benefit of students and learning. The program works with faculty from across the arts and sciences at Yale, and through it, CCAM collaborates on sixteen courses per year.
The courses are selected based on their engagement with the technology available at CCAM, including AI computation, motion capture, interactive light and sound systems, data visualization, and motion graphics. The projects and courses vary significantly based on the ideas of the faculty who apply to the program. The list below highlights the active research that is coming out of the Curriculum in Residence program as part of the HP + Yale Curricular Technology Project.
Principal Investigator: Dana Karwas
Director, Yale Center for Collaborative Arts and Media (CCAM)
Critic, Yale School of Architecture


I AM ALAN TURING

Can machines think?
I AM ALAN TURING is an OPERA project investigating Alan Turing’s legacy. Utilizing an AI trained specifically on Turing’s research to help craft the libretto, the opera embarks on a SPECULATIVE exploration to recreate Turing’s long-lost speaking VOICE. This full-length operatic work blends Turing’s scientific insights with state-of-the-art immersive technology. Through an INTERACTIVE USER INTERFACE (IUI) for the audience, the opera ultimately transforms into an expansive Turing Test, pushing the boundaries between human and machine intelligence.
Project Director: Matthew Suttor

Experimental Instrument Ensemble

How can technological tools and methodologies interface with musical expression in a live performance context?
The Experimental Instrument Ensemble is an interdisciplinary working group and associated curricular initiative that brings together emergent technology, INSTRUMENT DESIGN, and music COMPOSITION, as well as LIVE PERFORMANCE. Juxtaposing old and new technologies, we are interested in exploring ideas of accessibility, ability, virtuosity, AUTOMATION, control, and gesture through the creation of novel instruments and works for multi-player collective ensemble. Our research includes investigations into sonic and haptic feedback mechanisms, SYNCHRONIZATION, HCI, AI, and MACHINE LEARNING, and methods of interactivity associated with VR, GAMING, and computer vision platforms. This research is hands-on, and we learn through building, experimentation, and the iterative design process, drawing on our backgrounds as engineers, researchers, computer scientists, and musicians.
Project Director: Konrad Kaczmarek

Ultra Space

How do we design for the future of off-planet life?
How can we evolve a new culture of off-planet life, and what are the ARTIFACTS that will support this culture? How will the histories, philosophies, and myths of looking to the COSMOS for answers invert our relationship to space once we get beyond the scientific and technological achievements needed to maintain human life there? The Mechanical Artifact: Ultra Space is a course designed to engage these questions and connect architecture and design students to our unfolding sci-fi space future. The CCAM Ultra Space project combines coursework with research to build, test, and deploy a mechanism designed to function in ZERO GRAVITY. The designs are based on the M-Cubed system of mechanism, movement, and meaning—all of which need to be rethought for the environment of WEIGHTLESSNESS. Students design the artifacts in Rhino and test their designs in SIMULATION software such as Houdini, using HP Z8 FURY G5 workstations. The collection of artifacts is then fabricated on 3D printers or waterjet cutters and assembled for flight.
Project Director: Dana Karwas


What are the extents of human perception?
At the core of this research is an interest in SUBVERTING the Vicon MOTION CAPTURE system, typically used in the entertainment and engineering industries, to become a collaborator in artistic workflows. Through this research, custom hardware and software have been created to augment the use cases of motion capture, creating new methods of INTERACTIVE art that span experimental sound art, interactive projection design, augmented reality, and more. With this technology, data can be streamed and manipulated live between programs such as Ableton Live, MaxMSP, Unreal Engine, TouchDesigner, and more. Research into these manipulations examines individually and collectively tracked body movements, props tracked to control the SONIFICATION of space, and the remapping of camera position in a live scene.
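As a hypothetical sketch of how tracked data like this can be streamed between such programs, the example below encodes a marker position as an Open Sound Control (OSC) message, a protocol that MaxMSP and TouchDesigner can receive natively, and sends it over UDP using only the Python standard library. The address path /mocap/marker, the port number, and the use of OSC itself are illustrative assumptions, not a description of CCAM’s actual pipeline.

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate `data` and pad it to a 4-byte boundary (OSC string rule)."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are 32-bit big-endian floats."""
    type_tags = "," + "f" * len(args)
    packet = osc_pad(address.encode("ascii")) + osc_pad(type_tags.encode("ascii"))
    for value in args:
        packet += struct.pack(">f", value)
    return packet

# Stream one tracked marker's position (x, y, z in meters) each frame over UDP.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/mocap/marker", 0.12, 1.48, -0.33), ("127.0.0.1", 9000))
```

A receiving program listening on the same port can then map those three floats to sound parameters, projection coordinates, or a virtual camera position.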
Project Directors: Dana Karwas and Sarah Oppenheimer

Ornamenting Architecture

How do our brains respond to the architectural environment?
The ongoing research in the Ornamenting Architecture: Cosmos, Nature, Neuroaesthetics course at the Yale School of Architecture is pioneering a new, highly interdisciplinary approach to design pedagogy that integrates the emerging field of NEUROAESTHETICS—allowing us to examine our brains’ responses to ornament through the use of biometric tracking devices. Students analyze existing buildings’ ornament, as well as their own unique designs. To do so, they use EYE TRACKING headsets, EEG monitors, and analysis software that can quantify the effect that the presence (or absence) of ARCHITECTURAL ORNAMENT has on our areas of visual attention, gaze sequence, and brain activity. The students’ final project will be built and eye-tracked within an immersive environment on an HP Z8 Fury G5 workstation, allowing for comparison between multiple designs. The use of these BIOMETRIC technologies, live analysis software, and immersive simulation will enable novel forms of empirically grounded design. The course uses eye tracking software and hardware from Tobii.
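As a toy illustration of the kind of quantification this involves, the sketch below accumulates gaze dwell time per region of interest from a stream of (x, y) gaze samples. The region names, coordinates, and 60 Hz sample rate are illustrative assumptions; real gaze data would arrive through the eye tracker vendor’s own SDK.

```python
def dwell_times(samples, regions, sample_rate_hz=60):
    """Sum gaze time (in seconds) spent inside each named region.

    samples: iterable of (x, y) gaze points in screen coordinates.
    regions: dict mapping a region name to its (x_min, y_min, x_max, y_max) box.
    """
    dt = 1.0 / sample_rate_hz
    totals = {name: 0.0 for name in regions}
    for x, y in samples:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dt
    return totals

# Compare attention to an ornamented zone vs. a plain zone of a facade image.
regions = {"ornament": (0, 0, 100, 100), "plain": (101, 0, 200, 100)}
gaze = [(50, 50)] * 90 + [(150, 50)] * 30   # two seconds of samples at 60 Hz
print(dwell_times(gaze, regions))            # ornament: ~1.5 s, plain: ~0.5 s
```

Running the same analysis over two candidate designs gives a direct, quantitative comparison of where visual attention concentrates.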
Project Directors: Kassandra Semenov-Leiva and Misha Semenov-Leiva

Speculative Relics

Can materializations of irrevocably lost cultural matter, created through the use of emerging technologies, help us to better understand its significance?
From MISSING MANUSCRIPTS to LOST TREASURE, there is a world of valued objects that have VANISHED in part or in full and now “exist” primarily in human memory. These are largely inaccessible to the physical and digital archives we increasingly engage with today. As a way to “recover” or REDISCOVER them, the project will engage AI language and image generators to create SPECULATIVE texts and images that theorize their appearance. These will be TRANSLATED into designs and plans, and ultimately realized as 3D OBJECTS—using analog visual art practices and other methods of fabrication. The final works will be exhibited to inspire conversation around questions of AUTHENTICITY, OWNERSHIP, and the past and future of CULTURAL MEMORY. The first phase will create a conjectural library of notable lost books and manuscripts from around the world, and continue with a collection of “found” relics.
Project Director: Lauren Dubowski

The Collective Animal Behavior Project

What can music and image reveal to us in environmental data?
Collective Animal Behavior is an interdisciplinary science and art project that incorporates artificial intelligence and low-cost OPEN-SOURCE HARDWARE. It is used in wildlife and environmental research to generate musical expressions of animal behavior and ECOSYSTEMS. Through emergent technology, the project proposes new data VISUALIZATION techniques and exploratory data assessment to create music, DEMOCRATIZING the SONIFICATION of animal behavior and engaging local communities. It uses an HP Z8 Fury G5 workstation, AudioMoth full-spectrum acoustic loggers, PlantWave units, and Raspberry Pi hardware and software for teaching and music creation. A mobile system, deployed on laptop computers and tablets, is being developed for OUTREACH to local communities and farms.
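One simple form such sonification can take is a linear mapping from sensor readings to pitches on a musical scale. The sketch below maps readings onto a major pentatonic scale as MIDI note numbers; the scale choice, the value range, and the bird-call example are illustrative assumptions rather than the project’s actual mapping.

```python
PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets of a major pentatonic scale

def sonify(values, low, high, base_note=60, octaves=2):
    """Map each reading in [low, high] to a MIDI note on a pentatonic scale.

    base_note=60 is middle C; out-of-range readings are clamped.
    """
    steps = len(PENTATONIC) * octaves
    notes = []
    for v in values:
        t = (min(max(v, low), high) - low) / (high - low)  # normalize to 0..1
        i = min(int(t * steps), steps - 1)                 # scale-degree index
        octave, degree = divmod(i, len(PENTATONIC))
        notes.append(base_note + 12 * octave + PENTATONIC[degree])
    return notes

# e.g. hourly bird-call counts from an acoustic logger, mapped to rising pitch
print(sonify([3, 12, 27, 41, 55], low=0, high=60))  # → [60, 64, 69, 74, 81]
```

Restricting the output to a consonant scale keeps the result musical no matter what the data does, which is one reason pentatonic mappings are a common starting point for sonification.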
Project Directors: Matthew Suttor and Diego Ellis Soto

Teaching Creativity

Can creativity be taught?
The project explores strategy combined with pedagogy for big ideas: What is CREATIVE THINKING? Can creativity be taught? And why do we so often get stuck? To facilitate a longitudinal study of technology and creativity, the research is building an APPLIED EXPERIMENT to design, analyze, and replicate workflows in “creative thinking and making” in the MOTION CAPTURE studio at CCAM. The images here show Yale students and collaborators engaged in COMBINATORY PLAY, a game designed to encourage creativity, elaborating upon principles developed by ALBERT EINSTEIN. 
Project Directors: Elise Morrison and Matthew Suttor
