All posts by stephenheymann

Project 1 – Live Performance

I shared one of my favorite projects so far from the Fall 2016 quarter and was asked to elaborate on it by creating a live performance. I focused hard on learning how to operate Ableton Live 9 and wanted to enable the audience to play electronic instruments along with me. I made every sound from scratch through Reaktor, trying to maintain enough ambience and negative space in the track for audience members to introduce new sounds at any given moment.

Brief Description
Several tracks controlled by the performer; several tracks available for activation throughout the performance space. Risk element increased by controlling tracks live and allowing the audience to participate. Risk decreased by controlling all instrument effects on the mixer board and designing complementary sounds for any timing structure.

Soundscape Qualities
* Open space, allowing for participation
* Meditative
* Complementary
* Tracks sync no matter when they are activated live

Software
* Ableton Live
* MIDI Wi-Fi or Bluetooth controller app
* MIDI Designer Pro 2
* http://appcrawlr.com/ios/midi-designer-pro
* MIDI Studio
* https://itunes.apple.com/us/app/midi-studio/id519720275?mt=8&ign-mpt=uo%3D4

Hardware / Equipment
* Laptop
* 4 Channel Speakers
* Mixer board or Pad board
* iPad(s)
* Nintendo Wii Controller(s)
* Custom-built MIDI controllers through Arduino?
* Real objects connected to contact mics

Taroko Gorge – Sample

I'm working on modifying a poetry generator called Taroko Gorge, which started at MIT. Here are a few sample lines it has written on its own:

Cultures control the writing.
Truths ignore.
Sages seek the writings.

  ride the worldly destructive narrative story —

Sages grasp the writings.
Sages depart.
Sage controls the depth.

  avoid the impermanent loathsome destructive —
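At its core, a Taroko Gorge-style generator just picks random words and slots them into sentence templates. Here's a minimal plain-JavaScript sketch of that idea; the word lists are illustrative stand-ins, not the generator's actual vocabulary:

```javascript
// Minimal sketch of a Taroko Gorge-style line generator.
// Word lists below are made-up examples, not Nick Montfort's originals.
const nouns = ["cultures", "truths", "sages", "writings", "narratives"];
const verbs = ["control", "ignore", "seek", "grasp", "shape"];

function capitalize(word) {
  return word.charAt(0).toUpperCase() + word.slice(1);
}

function pick(list) {
  return list[Math.floor(Math.random() * list.length)];
}

// Build one line from the template "Subject verb the object."
function generateLine() {
  return `${capitalize(pick(nouns))} ${pick(verbs)} the ${pick(nouns)}.`;
}

console.log(generateLine());
```

Swapping in different word lists or templates changes the generator's voice without touching the logic.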

LED Light Programming

Last week we simulated programming LED lights in Processing. I have no prior experience in Processing, only p5js, so I’m still getting used to the syntax. I kept messing with cos and sin parameters to create different strobing effects with the lights.

Well, I guess I still have work to do to get a Processing sketch running on my blog, as the same method I use for p5js does not seem to be working. However, I'm pretty happy with the variety of strobe effects that I got with four different variations.
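The strobing idea itself can be sketched in plain JavaScript: a sine wave maps a frame counter to an LED brightness between 0 and 255. The frequencies and phase offsets here are made-up example values, not the ones from my Processing sketch:

```javascript
// Map a sin/cos wave to LED brightness (0-255).
// freq and phase are illustrative example values.
function strobeBrightness(frameCount, freq, phase) {
  // Math.sin returns -1..1; rescale to 0..255
  const wave = Math.sin(frameCount * freq + phase);
  return Math.round((wave + 1) / 2 * 255);
}

// Two LEDs strobing at different rates; the PI/2 phase
// offset makes the second behave like a cosine wave.
for (let frame = 0; frame < 5; frame++) {
  const led1 = strobeBrightness(frame, 0.3, 0);
  const led2 = strobeBrightness(frame, 0.7, Math.PI / 2);
  console.log(frame, led1, led2);
}
```

Raising `freq` makes the strobe faster; varying the phase per LED keeps them out of sync.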

Right now I’m reading up on my Adafruit Feather to try and get a better idea of its capabilities until we have access to them.

Cultural Event 1 – Solastalgia

Solastalgia – 3/31/17

Dateline: 3004 Larimer St.

Denver, CO

Jayne Butler / Molly Lofton / Noah Travis Phillips / Sarah Richter / Bart Rondet / Anna Winter

A mixed exhibition of EDP MFA students featuring virtual reality, digital, tactile and sonic arts.

I had a great time experiencing the wide breadth of work from our program's graduate students at Solastalgia. I started with Bart's HTC Vive virtual reality piece, which successfully conveyed a vast sense of space and constant shifts in environment. Molly's digital art print from Cinema 4D was well lit and did a great job contrasting digitally inspired renderings with a natural mountainous landscape. I liked Anna's combination of tactility with her video art, and thought that projection mapping the video directly onto the GAK may have made an interesting iteration of this project. Jayne's jacket camera was well-conceived and raised great questions regarding rights to safety vs. rights to privacy. Noah's audio experience was deeply immersive: wearing a mask while focusing on the soundscape created a sense of ritualism, which I picked up from the alien-like mantras. Sarah's projection mapping was scaled intimately and drew me in for several minutes, watching the simple cube structure slowly shift in context. Overall this was a fun show, and I was really inspired to dig into Unity over this summer to start creating immersive environments in VR.

Tangible Test

Test post!

-Stephen

Forgot My Glasses

This one feels finished! I've been working hard to learn as many of the functions in Audiotool as possible. It's turned out to be a powerful program for creating unique sounds from scratch. This piece was composed almost entirely in Audiotool; each sound was designed independently with a variety of synths and pedals. I made a higher-pitched solo background melody that I mapped to the keyboard through Reaktor, then combined the Audiotool and Reaktor files in Audition to master it. FYI – you can download Reaktor for free and also throw some free instrument plugins into the program right off their main site. It's super limited in its capabilities in free mode, but you can still generate an amazing variety of sounds with their free plugins.

Final – EDP Logo Generative Animation

I’m pretty happy with how my final project turned out. We were asked to bring in a few generative animation examples to use as creative inspiration for our final project. I brought in this example, and it didn’t take long for us to crack the code of how it was accomplished in class. I took this resampled code and started working on expanding its premise, which was pretty basic. Understanding classes and how to combine them with arrays was pivotal for developing this project.

Chris showed us how to detect image attachments in p5js by telling a particle to sense the image by color percentages. This opened up a new possibility: animate a ton of pixels and mask them until they sense certain images that I preloaded in the sketch. Here are the four images I designed; each had a different class of particles interacting with it to create a multilayered effect:


[Images: edp_grid3, edp_logo_outline, edp_logo_red_solo, edp_logo_red]
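The color-sensing idea boils down to reading the image's flat RGBA pixel array (what loadPixels() exposes in p5js) and testing whether the pixel under a particle is "red enough." The thresholds below are illustrative guesses, not the actual values from my sketch:

```javascript
// Given a flat RGBA pixel array (as p5's loadPixels() produces),
// decide whether the pixel at (x, y) is red enough for a particle
// to react to. Threshold values are assumed examples.
function isRedAt(pixels, imgWidth, x, y) {
  const i = 4 * (y * imgWidth + x); // 4 bytes per pixel: R, G, B, A
  const r = pixels[i];
  const g = pixels[i + 1];
  const b = pixels[i + 2];
  // "red enough": red dominates while green and blue stay low
  return r > 150 && g < 100 && b < 100;
}

// Tiny 2x1 test image: one red pixel, one white pixel
const testPixels = [255, 0, 0, 255,  255, 255, 255, 255];
console.log(isRedAt(testPixels, 2, 0, 0)); // → true  (red pixel)
console.log(isRedAt(testPixels, 2, 1, 0)); // → false (white pixel)
```

Each particle class can run this check against its own preloaded image, which is what produces the layered effect.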

By using variable speeds linked to Perlin noise, I was able to create some pretty fun effects. Perlin noise is very sensitive, so dialing in the right numbers took a very long time. There are 4000 total pixels at work in this animation, so it gets a little bogged down, as they are constantly searching for the red to trace in the image examples I posted above. I'm happy with my progress in p5 this quarter and am excited to keep digging into programming.
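As a rough sketch of how noise can drive per-particle speed: a smooth 0..1 signal, offset per particle, gets remapped into a speed range. This uses a simple 1D value-noise stand-in rather than true Perlin noise, and the speed range is an example value, not the one from my sketch:

```javascript
// Deterministic pseudo-random value in 0..1 for integer i
// (a common sin-hash trick, standing in for a real noise table).
function hashNoise(i) {
  const x = Math.sin(i * 127.1) * 43758.5453;
  return x - Math.floor(x);
}

// 1D value noise: cosine-interpolate between integer samples,
// a simplified stand-in for p5's noise().
function smoothNoise(t) {
  const i = Math.floor(t);
  const f = t - i;
  const u = (1 - Math.cos(f * Math.PI)) / 2; // ease between samples
  return hashNoise(i) * (1 - u) + hashNoise(i + 1) * u;
}

// Map the noise to a speed range; the per-particle offset
// gives each particle its own independent path through the noise.
function particleSpeed(particleId, frame) {
  const n = smoothNoise(particleId * 10 + frame * 0.01);
  return 0.5 + n * 2.5; // speeds between 0.5 and 3.0 (example range)
}

console.log(particleSpeed(0, 0), particleSpeed(1, 0));
```

The 0.01 frame multiplier is the sensitivity knob: tiny changes there are exactly the kind of dialing-in that takes so long.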

[Screenshot: screenshot_final]

Arduino & Potentiometers

We started messing around with Arduino, a microcontroller platform that can be programmed to control devices. I'm pretty illiterate when it comes to electricity and how connections work, so this is a good opportunity for me to gain some basic knowledge of circuitry. We plugged a few things into our breadboards, made lights flash and then started messing with a sliding potentiometer. This is where my interest is piqued, because I may be able to program a couple of knobs to interact with my final Sonic Arts animated projection. I'm pretty nervous about hooking this stuff up to my laptop, as we were warned a bad connection can fry your USB port.

I was provided a tutorial on how to connect p5js to a serial input, which is what I will need to do if I want to make my animation interactive. I really need to run through some more experiments with my SparkFun Kit to get comfortable with this stuff and still feel a little lost. We were provided a really nice introduction tutorial for the kit here.

Meanwhile, I have done a rough p5js sketch of my Sonic Arts projection; the only thing I've animated so far is some twinkling stars. The idea is that the blue ring would orbit around the planet and pulse to the music's amplitude levels. Ideally, users would be able to turn knobs and affect the color and intensity of the animation.
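The knob-to-animation mapping might look something like this, assuming the Arduino streams 0–1023 analog readings over serial (once p5.serialport or similar delivers the raw value). Function names and the output ranges are hypothetical examples:

```javascript
// Linear remap, like p5's map(): rescale value from one range to another.
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + (value - inMin) * (outMax - outMin) / (inMax - inMin);
}

// Hypothetical mapping from a potentiometer reading (0-1023,
// Arduino's analogRead range) to hue and intensity for the ring.
function knobToRingStyle(analogValue) {
  return {
    hue: mapRange(analogValue, 0, 1023, 180, 260),     // blue-violet band
    intensity: mapRange(analogValue, 0, 1023, 0, 255), // brightness
  };
}

console.log(knobToRingStyle(0));    // knob fully down
console.log(knobToRingStyle(1023)); // knob fully up
```

A second knob could feed the same pattern with a different output range, one per animation parameter.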

[Image: perpetuum]

p5js – Perlin Noise, Audio & Functions

Click the image to view the animation. // Screenshot from my Perlin Noise driven animation, which syncs with any accompanying soundtrack’s amplitude.

We messed around with Perlin noise in p5js last week, and I got pretty excited at the possibilities of random generative animation. I kept messing with this concept and have a couple of animations set up to link with audio. The amplitude of the audio basically affects the speed and size of the animated shapes. I'm pretty shocked that I've gotten to my goal of tying these elements together, and will hopefully have p5 incorporated as an animated, interactive projection in my final audio piece for Sonic Arts.
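The amplitude link boils down to smoothing the level (p5.sound's getLevel() returns roughly 0..1) and remapping it into size and speed. The smoothing factor and output ranges below are example values, not the ones from my sketch:

```javascript
// Smoothed audio level, updated once per frame.
let smoothedLevel = 0;

// Exponential moving average keeps the animation from jittering
// on every transient; smoothing factor is an example value.
function updateLevel(rawLevel, smoothing = 0.8) {
  smoothedLevel = smoothing * smoothedLevel + (1 - smoothing) * rawLevel;
  return smoothedLevel;
}

// Remap a 0..1 level into shape parameters (illustrative ranges).
function shapeParams(level) {
  return {
    size: 20 + level * 180, // diameter grows with loudness
    speed: 1 + level * 5,   // movement speeds up with loudness
  };
}

const level = updateLevel(0.5);
console.log(shapeParams(level));
```

In a real sketch, `updateLevel(amplitude.getLevel())` would run inside draw(), and the returned params would drive ellipse sizes and particle velocities.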

Initial Robot Head Sketches

Moving on, we jumped into creating custom functions which I can already tell is a powerful tool. Our project was to create a 20×20 grid of 4 different robot heads, with 4 different audio cues tied to clicking with the mouse. I’m happy to say I wrote the code without a lot of hangups, but I got stuck on assigning the audio cues.

Click the image to load the robot audio sketch. // 20 x 20 Grid of talking robot heads.

For “mousePressed” I nested my “if” statements and rendered them inoperable, but got it fixed up with my instructor’s guidance *wink wink* and some help from my colleague, Rich. I’m trying to go the extra mile and make every individual robot head clickable; as of right now, only each column is clickable.
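One way to make every head individually clickable is to hit-test both mouse coordinates against the grid, mapping them to a (row, col) cell instead of just a column. The cell size here is an assumed example value:

```javascript
// Grid dimensions from the 20x20 project; CELL is an assumed pixel size.
const COLS = 20;
const ROWS = 20;
const CELL = 30; // pixels per robot head (example value)

// Map mouse coordinates to the head that was clicked,
// or null if the click landed outside the grid.
function headAt(mouseX, mouseY) {
  const col = Math.floor(mouseX / CELL);
  const row = Math.floor(mouseY / CELL);
  if (col < 0 || col >= COLS || row < 0 || row >= ROWS) return null;
  return { row, col, index: row * COLS + col }; // unique id per head
}

console.log(headAt(45, 95)); // a head in column 1, row 3
console.log(headAt(-10, 5)); // outside the grid → null
```

Inside mousePressed, the returned `index` could then select which audio cue to trigger, replacing the nested column-only ifs with one lookup.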

Click the image to view the animation! // My first time writing a custom function – I like these glowing Halloween Heads.

I’m looking forward to learning Arduino this week, as it may be my gateway into figuring out a way to give my Sonic Arts installation that extra level of interactivity through touch activation. I want to tie my p5js code to physical objects that will signal the animation to change color and amplitude. We will see if I learn enough to get it done…

Functions for Halloween

Happy Halloween! Have some glowing creepy smiley faces.

Robot Functions

Click on any column for a custom robot noise. I made each robot noise in Audition, just tweaked my voice with some reverb and effects.