Generative Soundscape 0.2.1 – IR Comm & Circuit Prototyping

This was an iterative step toward a generative soundscape installation. I handled everything from concepting and UX design to PCB design, hardware prototyping, fabrication and assembly, testing, and validation.

Deliverables: Hardware prototypes (PCB modules)

Tools: Arduino, Eagle CAD, Othermill

Brief

After a failed attempt at creating modules that would communicate through sound, I started looking into Infrared Communication.

This project experiments with infrared communication between multiple modules. All modules come from the same design, made in Eagle CAD: a through-hole board milled on the Othermill. Each board carries an embedded Arduino (ATmega328), two IR receivers, three IR emitters, an LED, and a push-button.

Modules can capture the code from any IR remote and relay it among their closest peers.
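
A minimal sketch of that capture-and-relay behavior, assuming the classic IRremote Arduino library and hypothetical pin choices; the firmware on the actual boards may differ:

    #include <IRremote.h>

    const int RECV_PIN = 11;   // hypothetical pin for one of the two IR receivers
    IRrecv irrecv(RECV_PIN);   // emitters are driven by IRsend on the library's fixed PWM pin
    IRsend irsend;
    decode_results results;

    void setup() {
      irrecv.enableIRIn();     // start listening for remote or peer codes
    }

    void loop() {
      if (irrecv.decode(&results)) {
        irsend.sendNEC(results.value, 32);  // re-emit the captured code to nearby modules
        delay(50);
        irrecv.enableIRIn();                // sending borrows the timer, so re-arm the receiver
      }
    }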

Next Steps

Evaluate power consumption to figure out how the modules can be powered by batteries.

 

 

 

Previous Iterations

Ideation Tool – Redesign & Interactive Prototype

Previous Iteration

 

Call for Interaction

With mobile device peripherals in mind, I set out to create a more playful interaction, one found at the intersection of common real-world gestures and the mobile phone's accelerometer.

I borrowed two playful gestures from daily observation: reflexive spinning and swift juggling, two gestures that seemed technically feasible and experientially engaging.

UI Redesign

I decided to simplify the interface around the new experience. Accelerometer gestures remix the images, and a tap reveals a generated text prompt. This text is a computational remix of the descriptions of the objects shown on screen, so people can be inspired both visually and textually.

Interactive Prototype

The interactive prototype was made in JavaScript using the Cooper Hewitt Museum's API, RiTa (a toolkit for computational literature), and the p5.js library. This is where the interactive prototype can be experienced.

Through JavaScript, I retrieve data from the Cooper Hewitt Museum, including images and text from their online exhibition database. I clean the information and select a topic; in this case, all objects in the museum related to 3D printing.

Gestures

It turns out the spinning gesture is one of the blind spots of phone accelerometers. This is why the prototype only responds to juggling-type gestures.

Text Prompt

By retrieving the descriptions of the three objects shown on screen, I create one phrase by remixing their tokens through a set of computational procedures. Every time the images change, the tokens from which the phrases are built change as well.
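
The prototype itself does this remixing in JavaScript with RiTa, but the idea is simple; here is a rough sketch of the same token-remix procedure in C++, with placeholder descriptions standing in for the API data:

    #include <algorithm>
    #include <iostream>
    #include <random>
    #include <sstream>
    #include <string>
    #include <vector>

    int main() {
        // Placeholder descriptions standing in for the three on-screen objects.
        std::vector<std::string> descriptions = {
            "a 3d printed vase with ribbed translucent walls",
            "prototype chair produced by selective laser sintering",
            "nylon textile panel grown from a generative pattern"};

        // Pool every token from every description.
        std::vector<std::string> tokens;
        for (const auto& d : descriptions) {
            std::istringstream words(d);
            std::string w;
            while (words >> w) tokens.push_back(w);
        }

        // Shuffle the pool and stitch a short phrase from the first few tokens.
        std::mt19937 rng(std::random_device{}());
        std::shuffle(tokens.begin(), tokens.end(), rng);
        std::string phrase;
        for (size_t i = 0; i < 7 && i < tokens.size(); ++i)
            phrase += tokens[i] + " ";

        std::cout << phrase << "\n";  // a new prompt every time the images change
    }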

Even though the generated phrases have grammatical errors, embracing the computational glitchiness aligns with the overall playful, mind-diverting concept of overcoming a creative block.

BEAM Solar Robot

This is an ongoing project to make a sphere spin using a solar-powered motor. BEAM robots (Biology, Electronics, Aesthetics and Mechanics) are robots driven by analogue circuits instead of microcontrollers.

This robot can run on two types of circuit: one based on a voltage trigger and another based on diodes (Zener or signal). In the end we decided to go with the signal-diode circuit.

The electronic components in this BEAM solar robot are: a Voltaic 2W, 6V solar panel, a 1F capacitor, a 6V, 280mA DC motor, a PNP transistor (2N3906), two signal diodes in series, a 2.2KΩ resistor, and an NPN transistor (2N2904). The circuit works like this: the capacitor charges until the PNP transistor receives base current through the signal diodes and turns on. That turns on the NPN transistor, which discharges the capacitor through the motor. As the NPN turns on, the 2.2K resistor starts supplying base current to the PNP, so the circuit snaps fully on. When the capacitor voltage drops below about 1V, the PNP turns off, the NPN turns off and disconnects the motor from the capacitor, and the capacitor starts charging up again.
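
As a rough back-of-the-envelope figure, if we assume the two diode drops plus the PNP's base-emitter drop put the trigger point near 2.1V (an assumption, not a measurement) and take the stated 1V cutoff, each pulse dumps roughly

    E = 1/2 · C · (V_on² − V_off²)
      = 1/2 · 1 F · ((2.1 V)² − (1.0 V)²)
      ≈ 1.7 J

into the motor before the capacitor has to charge back up.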

We changed to this motor after failing with a high-efficiency motor (4V, 30mA). Despite the change, the torque from the 6V motor (± 180 gm/s^2) with 1.4cm radius wheels still isn't enough to drive the entire rig (circuit, plastic disc, and plastic sphere). Next steps could be getting a more powerful motor or making the entire robot lighter.

ReSounding the City

This performance was made possible by the Graduate Student Organization (GSO) Grant at NYU's Tisch School of the Arts. In collaboration with Daniela Tenhamm-Tejos, Jana L. Pickart, and Ansh Pattel, we explored body language and psychogeography in urban spaces. I was the developer behind the gesture recognition code. For the official website, please visit this link.

For this project I created a series of visual effects that responded to the performer's choreography, the poet's voice, and audience interaction. These effects were built with openFrameworks, a C++ toolkit. Here is a sneak peek of these effects.

Performer and Tech Tryout

All Developed Effects

M-Code Box

Concept

How can a fabricated object have an interactive life? The M-Code Box is a manifestation of words translated into tangible Morse code percussion. You can find the code here; all that's needed to create an M-Code Box is an Arduino UNO, a solenoid (with an external power source and a simple driver circuit), and a laptop running Processing.

Next Steps

There are two paths to take this project further. One is to add an interpreter component that records the sounds and decodes them back into words, like conversation triggers. The second is to start thinking about musical compositions by multiplying this box and varying its materials and dimensions.

 

 

Previous Iterations

This project came about by assembling two previous projects: the Box Fab exploration of living hinges and the Morse Code Translator, which turns typed text into physical pulses.

Generative Synthesizer Prototype

This is a follow-up to the Generative Propagation concept. These exercises were intended to answer two questions:

  1.  How can the trigger threshold be physically controlled? (How can the mic’s sensitivity be manipulated?)
  2. How can the tempo be established? (How often should each module emit a sound?)

The trigger threshold can be manipulated by manually controlling the microphone's gain, that is, the amount of voltage transferred to the amplifier (a potentiometer wired to the amp IC).

By manipulating this potentiometer, the sensitivity of the microphone can be controlled.

The tempo can be established by timing the trigger's availability. By setting a timer that re-enables listening after each trigger, the rate at which the entire installation reproduces sounds can be controlled.
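
A minimal Arduino-style sketch of both answers together, assuming the amplified mic is read on A0 and with the sound output abstracted into a placeholder playSound(); the actual module firmware may differ:

    const int MIC_PIN = A0;                     // amplified mic (gain set by the potentiometer)
    const int TRIGGER_THRESHOLD = 600;          // hypothetical ADC level that counts as a trigger
    const unsigned long LISTEN_LOCKOUT = 2000;  // ms to ignore the mic after firing: the "tempo"

    unsigned long lastTrigger = 0;
    bool listening = true;

    void playSound() {
      // placeholder: emit this module's sound (tone, DAC sample, etc.)
    }

    void setup() {
      pinMode(MIC_PIN, INPUT);
    }

    void loop() {
      unsigned long now = millis();

      // The timer re-enables listening, which sets the overall rate of the installation.
      if (!listening && now - lastTrigger >= LISTEN_LOCKOUT) {
        listening = true;
      }

      // The potentiometer changes how loud a sound must be to cross this threshold.
      if (listening && analogRead(MIC_PIN) > TRIGGER_THRESHOLD) {
        playSound();
        lastTrigger = now;
        listening = false;
      }
    }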

Morse Code Translator

Inspired by the "Hi Juno" project, I sought an easier way to use Morse code. This is why I created the Morse Code Translator, a program that translates your text input into "morsed" physical pulses. One idea to explore further is how words could be expressed through physically perceivable media (sound, light, taste?, color?, temperature?).

So far I've got the serial communication and the Arduino functionality working. In other words, the idea works up to the Arduino's embedded LED (pin 13). This is how "HI" looks when translated into light.
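
The actual split between Processing and the Arduino lives in the repo linked below; purely as an illustration, here is a small Arduino-style sketch that receives characters over serial and blinks them on pin 13 in Morse (the timings and lookup table are assumptions of mine, not necessarily the repo's):

    const int LED_PIN = 13;          // the Arduino's embedded LED
    const int DOT_MS = 150;          // base time unit; dashes last three units

    // Morse table for A-Z (index 0 = 'A').
    const char* MORSE[26] = {
      ".-", "-...", "-.-.", "-..", ".", "..-.", "--.", "....", "..",
      ".---", "-.-", ".-..", "--", "-.", "---", ".--.", "--.-", ".-.",
      "...", "-", "..-", "...-", ".--", "-..-", "-.--", "--.."};

    void pulse(char symbol) {
      digitalWrite(LED_PIN, HIGH);
      delay(symbol == '.' ? DOT_MS : 3 * DOT_MS);
      digitalWrite(LED_PIN, LOW);
      delay(DOT_MS);                 // gap between symbols
    }

    void setup() {
      pinMode(LED_PIN, OUTPUT);
      Serial.begin(9600);
    }

    void loop() {
      if (Serial.available() > 0) {
        char c = toupper(Serial.read());
        if (c >= 'A' && c <= 'Z') {
          for (const char* s = MORSE[c - 'A']; *s; s++) pulse(*s);
          delay(2 * DOT_MS);         // extra gap between letters
        } else if (c == ' ') {
          delay(6 * DOT_MS);         // gap between words
        }
      }
    }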


Follow-up: making the solenoid work through Morse-coded pulses. You can find the Processing and Arduino code in this GitHub repo.

Peru's Pavilion in FILBO 2014

Perú was the guest of honor at Bogotá's International Book Fair (FILBO). Panoramika was commissioned by the Peruvian Cultural Office to create various interactive installations, projection mappings, and light designs. I was assigned the creation of one of the three installations crafted by Panoramika: four screens that would reveal to passersby random excerpts from "Captain Pantoja and the Special Service", Nobel laureate Mario Vargas Llosa's comedic novel.

To implement this installation, I developed Quartz Composer patches that visually changed text compositions whenever people crossed the threshold of an infrared sensor. The sensor was read by an Arduino and interfaced to QC.
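
A minimal sketch of the Arduino side of that trigger, assuming an analog IR distance sensor on A0 and a hypothetical threshold value; Quartz Composer can then read the serial output (for example through a serial-input plugin) and switch the text composition:

    const int SENSOR_PIN = A0;           // analog IR distance sensor
    const int PRESENCE_THRESHOLD = 400;  // hypothetical reading that means "someone is there"

    bool present = false;

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      bool nowPresent = analogRead(SENSOR_PIN) > PRESENCE_THRESHOLD;

      // Only report changes, so QC gets a clean on/off signal to switch compositions.
      if (nowPresent != present) {
        present = nowPresent;
        Serial.println(present ? 1 : 0);
      }
      delay(50);
    }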

Systema Solar Live Act

Brief

The Colombian band Systema Solar commissioned us to create an interactive live show. With a team of three creative technologists, we developed different real-time visual effects. I was in charge of coding the puppetry controls and the audio-reactive silhouette patches, and of figuring out UX best practices. We created a VJ deck, from the physical rack to the digital patches.

Overview

To better understand the puppetry possibilities of the Kinect, we first figured out how Animata worked. After that first glimpse, I began this patch from scratch in VVVV, a live programming environment. Even though I had no previous experience with the Kinect or VVVV, this project was evidence of perseverant work, squeezed wit, and a bit of sought-after luck. By the end, there were three crafted puppets of Systema Solar's crew (Johnpri –lead singer–, Walter –lead performer & singer– and Corpas –DJ/scratcher–).

The VJ Deck

The rack is composed of one Kinect, three GoPro cameras, seven signal converters, one MIDI pad, one Mac mini, and one four-channel mixer. These four signals are the input for the VJ's laptop.

Rehearsal

Video Documentation