Graduate Critique

Grad Crit – Spring 2017

Stop Motion – Reclamation

My original project proposal was based on the thalamus, the section of the brain that processes sensory input. I was speculating on the ability to control this section of the brain to expose unconscious bias or to encourage more thoughtful reactions to environmental input. I hit a brick wall with this concept and moved on so I would at least have something to show for class.

I turned to a different subject matter that I’ve been obsessed with lately: time-lapse photography of plants. I think of nature as a highly intelligent hyperobject, moving through time at a completely different pace than humans, which conceals its true agency in the world. I was particularly interested in slime mold because of its primordial ancestry and the way it shape-shifts to navigate and feed. I felt I could recreate this creeping, intelligent motion with melted wax and stop-motion video to convey a lengthy passage of time.

In the resulting video, I am exploring the possibility of our primordial ancestry reclaiming technology from its distant relationship with humankind. By consuming a piece of RAM hardware, the slime takes an evolutionary leap forward. Like its human counterparts, it can now both consume and produce in different ways. The result of the slime’s newfound agency is felt like a small shockwave across the forest floor. The networked layers of abstraction that exist in nature are brought to the foreground, and multiple pathways for new forms of life are created.

Here are some inspirational links that I was thinking about when making the film:


Sleepcaster 2.0

The second iteration of Sleepcaster was a deeper dive into the physical representation of controlling a remote version of oneself while also offering the experience of being touched. I was also asked to mock up a product site to flesh out more details on this speculative product. Only once I started actually building out the product website (which is not my forte) did the pros and cons of the Sleepcaster’s capabilities become much easier to articulate. I am still working on the site, but the top of the page has a p5.js animation I programmed to help convey the responsive cell technology that the special bed relies on. Click here to see the Sleepcaster product site.
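The p5.js sketch itself isn’t reproduced here, but the core idea behind a “responsive cell” surface — a grid of cells that rise toward a contact point and relax with distance — can be sketched with a simple falloff function. This is a minimal illustration in plain JavaScript; all function names and parameters are my own, not code from the actual sketch:

```javascript
// Hypothetical sketch of the responsive-cell idea: each cell's height is
// driven by its distance from a contact point, with a smooth falloff.
function cellHeight(cellX, cellY, touchX, touchY, maxHeight, radius) {
  const dx = cellX - touchX;
  const dy = cellY - touchY;
  const dist = Math.sqrt(dx * dx + dy * dy);
  if (dist >= radius) return 0;            // outside the influence radius
  const t = 1 - dist / radius;             // 1 at the touch point, 0 at the edge
  return maxHeight * t * t;                // quadratic falloff for a soft bump
}

// Build one frame of a small grid responding to a touch at (touchX, touchY).
function gridFrame(size, touchX, touchY) {
  const frame = [];
  for (let y = 0; y < size; y++) {
    const row = [];
    for (let x = 0; x < size; x++) {
      row.push(cellHeight(x, y, touchX, touchY, 10, 3));
    }
    frame.push(row);
  }
  return frame;
}
```

In a p5.js `draw()` loop, each frame of this grid could be rendered as rectangles or ellipses scaled by their height, animating the contact point over time.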

I’m not thrilled with the mechanical hand I built. The base of the hand is made from a cardboard cutout of my own hand, but mechanically it doesn’t function as “real” as I would like because the surrounding foam inhibits movement. If I have the time, I want to build a more robust model that can be controlled with a microchip and motor.

Sleepcaster 1.0

The assignment was to develop a speculative design, so I started combing through emerging technologies that could be expanded on or relocated into a new context. I had come across MIT’s Tangible Media Group’s inFORM project before and felt this technology might help inform our evolving definition of intimacy in the digital age.

My speculation takes inFORM’s reactive cell technology and applies it to an intimate space where the user can transmit and receive live signal representations of human forms. This speculation raises numerous questions in terms of intimacy, safety, and personal privacy. Would this satisfy a lonely person’s desire for a comforting human touch? Would we withdraw even further from the physical marketplace of relationships and fulfill our needs with mechanistic reproductions of our most intimate moments?

My first iteration involved a rough physical depiction of a human form rising out of a soft mattress-like surface as well as technical drawings to help illustrate the concept in greater detail.

Project 1.2 – Live Performance

After playing my initial sample for class, I was encouraged to explore tactile sound design to foster a more analog-centric performance. John Cage’s Water Walk was provided as an example of experimental analog music. Keeping simplicity in mind, I started building instruments out of things lying around my apartment and the following instruments took form:

Analog Noise Makers
  • Activators
    • Wooden Spoon
    • Rubber Mallet
    • Large Brush
    • Whisk
    • Tennis Balls
    • Bike Pedal
  • Passive
    • Upside Down Djembe Drum
    • Bike Wheel Spokes
    • Exercise Ball
    • Metal pot
    • Water
    • Cymbal

Unfortunately we didn’t have time to play my new project setup during class, but these are pictures of the planned performance. Instead of putting instruments at the audience’s direct disposal, I set out soft square pads that serve as activators for people to step or sit on. When a person stands on a pad, it prompts me to play a specific analog instrument. While operating the mixing board for Ableton Live, I kept all of the analog instruments within easy reach of one sitting position. This, I believed, would help lower the level of tension caused by scrambling back and forth. The installation space is entered through a small door and lined with soft, squishy pads on the floor. I was hoping that stepping on the pads at the entrance would clue participants in to interacting with the square pads inside the space.

Project 1 – Live Performance

I shared one of my favorite projects so far from the Fall 2016 quarter and was asked to elaborate on it by creating a live performance. I focused hard on learning how to operate Ableton Live 9 and wanted to enable the audience to play electronic instruments along with me. I made every sound from scratch through Reaktor, trying to maintain enough ambience and negative space in the track for audience members to enter new sounds at any given moment.

Brief Description
Several tracks controlled by performer, several tracks available for activation throughout performance space. Risk element increased by controlling tracks live and allowing audience to participate. Risk decreased by controlling all instrument effects on mixer board and designing complementary sounds for any timing structure.

Soundscape Qualities
* Open space, allowing for participation
* Meditative
* Complementary
* Tracks sync no matter when they are activated live
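Keeping tracks in sync no matter when they are triggered is essentially launch quantization, the behavior Ableton Live uses for clip launching: a triggered clip doesn’t start immediately but waits for the next bar boundary. A minimal sketch of that scheduling logic (the function name and parameters are mine, not Ableton’s API):

```javascript
// Hypothetical launch-quantization helper: given the current transport
// position in beats and a quantization interval (e.g. 4 beats = one bar
// of 4/4), return the beat on which a newly triggered clip should start.
function nextLaunchBeat(currentBeat, quantizeBeats) {
  const remainder = currentBeat % quantizeBeats;
  if (remainder === 0) return currentBeat;          // already on the grid
  return currentBeat + (quantizeBeats - remainder); // wait for next boundary
}
```

Because every audience-triggered clip is deferred to the same grid, sounds entered at arbitrary moments still land on bar boundaries and stay in phase with the performer’s tracks.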

Software
* Ableton Live
* Midi Wifi or Bluetooth Controller App
* Midi Designer Pro 2
* http://appcrawlr.com/ios/midi-designer-pro
* Midi Studio
* https://itunes.apple.com/us/app/midi-studio/id519720275?mt=8&ign-mpt=uo%3D4

Hardware / Equipment
* Laptop
* 4 Channel Speakers
* Mixer board or Pad board
* iPad(s)
* Nintendo Wii Controller(s)
* Custom-built midi controllers through Arduino?
* Real objects connected to contact mics

Cultural Event 1 – Solastalgia

Solastalgia – 3/31/17

Dateline: 3004 Larimer St.

Denver, CO

Jayne Butler / Molly Lofton / Noah Travis Phillips / Sarah Richter / Bart Rondet / Anna Winter

A mixed exhibition of EDP MFA students featuring virtual reality, digital, tactile and sonic arts.

I had a great time experiencing the wide breadth of work from our program’s graduate students at Solastalgia. I started with Bart’s HTC Vive virtual reality piece, which successfully conveyed a vast sense of space and constant shifts in environment. Molly’s digital art print from Cinema 4D was well lit and did a great job contrasting digitally inspired renderings with a natural mountainous landscape. I liked Anna’s combination of tactility with her video art, and thought that projection mapping the video directly onto the GAK might have made an interesting iteration of this project. Jayne’s jacket camera was well conceived and raised great questions regarding rights to safety versus rights to privacy. Noah’s audio experience was deeply immersive: wearing a mask while focusing on the soundscape created a sense of ritualism, which I picked up from the alien-like mantras. Sarah’s projection mapping was scaled intimately and drew me in for several minutes, watching the simple cube structure slowly shift in context. Overall this was a fun show, and I left really inspired to dig into Unity over the summer to start creating immersive environments in VR.