Ideation Tool – Redesign & Interactive Prototype

Previous Iteration

 

Call for Interaction

With mobile device peripherals in mind, I set out to create a more playful interaction, one drawn from the intersection between common real-world gestures and the mobile phone's accelerometer.

I borrowed two playful gestures from daily observation: the reflexive spin and the swift juggle, two gestures that seemed both technically feasible and experientially engaging.

UI Redesign

I decided to simplify the interface around the new experience: shaking the phone remixes the images, and a tap reveals a generated text prompt. This text is a computational remix of the descriptions of the objects shown on screen, so people can draw inspiration both visually and textually.
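A minimal p5.js sketch of this flow, with placeholder colors and descriptions standing in for the Cooper Hewitt objects, and a naive stand-in for the text generation described below:

```javascript
// Minimal sketch of the interaction flow: shake remixes, tap reveals text.
// The three objects here are placeholders for the Cooper Hewitt images.
let objects = [
  { col: '#e63946', desc: 'printed nylon lamp with woven shade' },
  { col: '#457b9d', desc: 'prototype chair of sintered powder' },
  { col: '#2a9d8f', desc: 'parametric vase in translucent resin' }
];
let prompt = '';
let showPrompt = false;

function setup() {
  createCanvas(windowWidth, windowHeight);
  setShakeThreshold(30); // how hard a juggle must be to register
}

function deviceShaken() {
  shuffle(objects, true); // the "remix": reorder what is shown
  prompt = buildPrompt(); // regenerate the text from the new mix
}

function touchStarted() {
  showPrompt = !showPrompt; // a tap reveals or hides the prompt
  return false;
}

function buildPrompt() {
  // naive stand-in for the token remix described later
  let words = objects.map(o => o.desc.split(' ')).flat();
  shuffle(words, true);
  return words.slice(0, 8).join(' ');
}

function draw() {
  background(255);
  for (let i = 0; i < objects.length; i++) {
    fill(objects[i].col);
    rect(20, 20 + i * 110, width - 40, 100);
  }
  fill(0);
  if (showPrompt) text(prompt, 20, height - 40);
}
```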

Interactive Prototype

The interactive prototype was built in JavaScript using the Cooper Hewitt Museum's API, RiTa (a toolkit for computational literature), and the p5.js library. This is where the interactive prototype can be experienced.

Through JavaScript, I retrieve data from the Cooper Hewitt Museum's online collection database, including images and text. I clean the information and select a topic, in this case all objects in the museum related to 3D printing.
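The retrieval step could look roughly like this, using the Cooper Hewitt REST API's cooperhewitt.search.objects method; the parameter and response field names here are from memory of the API docs and should be double-checked:

```javascript
// Hedged sketch of the retrieval step against the Cooper Hewitt REST API.
// TOKEN is a placeholder; field names should be verified against the docs.
const API = 'https://api.collection.cooperhewitt.org/rest/';
const TOKEN = 'YOUR_ACCESS_TOKEN';

async function fetchObjects(query) {
  const url = `${API}?method=cooperhewitt.search.objects` +
              `&access_token=${TOKEN}&query=${encodeURIComponent(query)}` +
              `&has_images=1&per_page=100`;
  const res = await fetch(url);
  const data = await res.json();
  // keep only objects that actually have an image and a description
  return data.objects
    .filter(o => o.description && o.images && o.images.length)
    .map(o => ({
      description: o.description,
      image: o.images[0].b.url // 'b' = big size, per the API's image keys
    }));
}

fetchObjects('3D printing').then(objs => console.log(objs.length, 'objects'));
```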

Gestures

It turns out that spinning is one of the blind spots of phone accelerometers: a smooth rotation produces almost no change in linear acceleration, so catching it reliably would need the gyroscope. This is why the prototype responds only to juggling-type gestures, which produce a clear acceleration spike.
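To illustrate the distinction, here is a minimal p5.js test using the library's built-in acceleration globals; a toss spikes the reading while a smooth spin barely moves it. The threshold value is an assumption to be tuned on a real device.

```javascript
// Flashes the screen when the frame-to-frame change in acceleration
// exceeds a threshold, i.e. when the phone is tossed rather than spun.
const JUGGLE_THRESHOLD = 25; // tune on the actual device

function setup() {
  createCanvas(windowWidth, windowHeight);
}

function draw() {
  // total change in acceleration since the previous frame
  let jolt = abs(accelerationX - pAccelerationX) +
             abs(accelerationY - pAccelerationY) +
             abs(accelerationZ - pAccelerationZ);
  background(jolt > JUGGLE_THRESHOLD ? 'gold' : 255);
  fill(0);
  text(nf(jolt, 1, 2), 20, 40);
}
```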

Text Prompt

By retrieving the descriptions of the three objects shown on screen, I create one phrase by remixing their tokens through a set of computational procedures. Every time the images change, so do the tokens from which the phrases are built.
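The sketch below shows one way this remix could look with RiTa.js, pooling and shuffling the tokens of the three descriptions; it is a simplified illustration rather than the project's exact procedure, and the ten-token phrase length is an arbitrary choice.

```javascript
// Pool the tokens of the three on-screen descriptions, shuffle them,
// and stitch a short phrase back together. Assumes RiTa.js is loaded.
function remixPhrase(descriptions) {
  let tokens = descriptions.flatMap(d => RiTa.tokenize(d));
  shuffleArray(tokens);
  return RiTa.untokenize(tokens.slice(0, 10));
}

function shuffleArray(a) {
  // Fisher-Yates shuffle, in place
  for (let i = a.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [a[i], a[j]] = [a[j], a[i]];
  }
}
```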

Even though the generated phrases contain grammatical errors, embracing this computational glitchiness aligns with the overall playful, mind-diverting concept of overcoming a creative block.

Solar Data Logger

Concept

How might we log traffic through ITP's entrance and compare elevator use with stair use? Using a solar-powered DIY Arduino, we decided to visualize this data (and store it in a .csv table) on the screen between the elevators at ITP's entrance.

Development

After creating a DIY Arduino that could be powered by solar energy, we followed Kina's tutorial to set up a basic solar rig that charges a 3.7 V, 1200 mAh LiPo battery. We connected the solar panels in series and ended up with an open-circuit voltage of 13 V. Our current readings, however, were only about 4 mA; at that rate, fully charging the 1200 mAh battery would take on the order of 300 hours.

We fed the Arduino data into a Processing sketch that overwrites the table data of a .csv file every second. All of the code can be found in this link.
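The actual logger is a Processing sketch, but the logic translates directly; here is a hedged JavaScript (Node.js) equivalent, assuming the serialport npm package, a placeholder port path, and an Arduino that prints one comma-separated "stairs,elevator" reading per line:

```javascript
// Reads lines from the Arduino over serial and overwrites a small CSV
// table once per second, mirroring what the Processing sketch did.
const fs = require('fs');
const { SerialPort } = require('serialport');
const { ReadlineParser } = require('@serialport/parser-readline');

const port = new SerialPort({ path: '/dev/tty.usbmodem1411', baudRate: 9600 });
const parser = port.pipe(new ReadlineParser({ delimiter: '\n' }));

let latest = '0,0'; // most recent "stairs,elevator" pair from the Arduino
parser.on('data', line => { latest = line.trim(); });

// overwrite the table every second
setInterval(() => {
  const row = `${new Date().toISOString()},${latest}\n`;
  fs.writeFileSync('log.csv', 'timestamp,stairs,elevator\n' + row);
}, 1000);
```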

ReSounding the City

This performance was made possible by the Graduate Student Organization (GSO) Grant at NYU's Tisch School of the Arts. In collaboration with Daniela Tenhamm-Tejos, Jana L. Pickart, and Ansh Pattel, we explored body language and psychogeography in urban spaces. I was the developer behind the gesture-recognition code. For the official website, please visit this link.

For this project I created a series of visual effects that responded to the performer's choreography, the poet's voice, and audience interaction. These effects were built with openFrameworks, a C++ toolkit. Here is a sneak peek of these effects:

Performer and Tech Tryout

All Developed Effects

M-Code Box

Concept

How can a fabricated object have an interactive life? The M-Code Box is a manifestation of words translated into tangible Morse code percussion. You can find the code here; to build an M-Code Box you need an Arduino UNO, a solenoid (with an external power source and a simple driver circuit), and a laptop running Processing.
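The heart of the box is a two-step translation: text to Morse marks, then marks to pulse durations for the solenoid. Here is that step sketched in JavaScript (the actual project uses Processing and Arduino); the 100 ms base unit is an assumption, while the 1:3 dot-to-dash ratio is standard Morse timing.

```javascript
// Translate text into [onDuration, offDuration] pairs for a solenoid.
const MORSE = {
  a: '.-', b: '-...', c: '-.-.', d: '-..', e: '.', f: '..-.', g: '--.',
  h: '....', i: '..', j: '.---', k: '-.-', l: '.-..', m: '--', n: '-.',
  o: '---', p: '.--.', q: '--.-', r: '.-.', s: '...', t: '-', u: '..-',
  v: '...-', w: '.--', x: '-..-', y: '-.--', z: '--..'
};
const UNIT = 100; // ms per Morse unit (assumed base tempo)

function toPulses(text) {
  const pulses = [];
  for (const ch of text.toLowerCase()) {
    const code = MORSE[ch];
    if (!code) continue; // skip anything without a Morse mapping
    for (const mark of code) {
      // dot = 1 unit on, dash = 3 units on; 1 unit of silence between marks
      pulses.push([mark === '.' ? UNIT : 3 * UNIT, UNIT]);
    }
    pulses[pulses.length - 1][1] = 3 * UNIT; // longer gap between letters
  }
  return pulses;
}

console.log(toPulses('hi')); // four dots, then two dots
```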

Next Steps

There are two paths for taking this project further. One is to add an interpreter component that records the box's sounds and re-encodes them into words, like conversation triggers. The second is to start thinking about musical compositions by multiplying the box and varying its materials and dimensions.

 

 

Previous Iterations

This project came from assembling two previous projects: the Box Fab exploration of living hinges, and the Morse Code Translator, which turns typed text into physical pulses.

Ideation Tool from Cooper Hewitt Museum API

This project is an ongoing pursuit of the question: how do you overcome a creative block? Partnering with Lutfiadi Rahmanto, we started out scribbling, sketching, and describing the problem to better understand what it meant for each of us, how to scope it, and how we usually respond to it.

UX Research

From the first session we narrowed the idea to a definite goal: a tool to aid inspiration in the creative process. This led us to consider various aspects of the target scenario and to start interviewing other creatives. We sought to understand, qualitatively, how creatives describe a creative block and, more importantly, how they overcome it. From this session we also reflected on how to aid the starting point of ideation, often a hard endeavor. A resonating answer, in the end, was linking unrelated words, concepts, or ideas.

We also researched two articles featuring subject-matter experts on creative block and overcoming it ("How to Break Through Your Creative Block: Strategies from 90 of Today's Most Exciting Creators" and "Advice from Artists on How to Overcome Creative Block, Handle Criticism, and Nurture Your Sense of Self-Worth"). Here we found our initial hypothesis collaged with additional components such as remix, as in Jessica Hagy's wonderful analogical method of overcoming her creative block: randomly grabbing a book, opening it to a random page, and linking "the seed of a thousand stories". Another valuable insight was creating a space of diverted focus away from the task generating the block. We also found a clear experience-design directive for our app: balancing constraint (structured, scrambled data from the API) with freedom (imaginative play).

Brief, Personas and Scenarios

After validating our intuitive hypotheses through the contextual inquiries and online articles, we came up with a solid design brief:

Encourage a diverted focus in which people can create ideas by scrambling data from the Cooper Hewitt database into random phrases.

Through this research we identified seven different behavior patterns and mapped them onto a two-axis map that defines where personas fall between casual/serious and unique/remix.

For a more detailed description of these archetypal behaviors, visit this link.

This enabled us to establish a guiding design path through what Lola Bates-Campbell describes as the MUSE: an outlier persona that directs and answers the usual nuances of designing, in this case, a mobile application to aid Mae Cherson with her creative block. We determined her goals and underlying motivations, her usual activities in her creative environment, and how she navigates between small and large creative blocks in her workspace. We also described her attitudes toward the blocking scenario and how her feelings become entangled when seeking inspiration. Some other traits were determined as well and can be explored in more detail through this link. Overall, we crafted this MUSE as a reference point for creating an inspirational experience for the selected archetypes, The Clumsy Reliever and The Medley Maker.

Engagement

Parallel to the archetype mapping, we began thinking about how to engage our audience: artists, designers, writers, thinkers, makers, tinkerers, all poiesis casters. We soon realized the opportunity to captivate this audience through a game-like interaction, a gameplay that requires simple gestures and encourages discoverability. Some of the games we took as references are Candy Crush and 2 Dots, two simple games that stand out for their heavy and widespread engagement.

Wireframe Sketches

With research cues and possible game-like affordances in mind, there is fertile space to weave tentative design solutions. Hence we spent a couple of sessions sketching layouts, concepts, poetic interactions, and nonsense infractions.

On the other side, we made sense of it all and sought a balance between amusement and feasibility. At the end of this session we came up with three design layout concepts and general affordances (calls to interaction): Linking, Discovering, and Dragging.

Test Insight

From these concepts we started making interactive prototypes. While creating the Discovering prototype, we realized that people's intuitive mental model of a Candy Crush-like interaction did not match our design intent, and trying to force a match became overly complicated. This is why we moved on to prototypes for the Linking and Dragging concepts.

Prototypes

Another prototype explores the underlying preference between text-driven and visually driven inspiration. While testing these prototypes we realized some people tend to feel more inspired by imagining the words of a text, while others feel more inspired by visual cues. This prototype allows both explorations.

The next step is to select one gameplay interaction from our user tests and syntactically address the text data from the API.


This is another interaction mode, the Remixing Mode, conceived after Katherine's valuable feedback on our final prototype, which can be accessed in this link.

Palindrome Hour Web-Clock

This project celebrates hours that can be read both left-to-right and right-to-left, just like palindromic text ("flee to me, remote elf"): a concept of living symmetry overlaid with pleasing coincidence and chunks of daily serendipity.

I designed and coded this project in JavaScript with the creative toolkit p5.js. Hop in and catch the palindrome hours! Link To Project Here
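The core check is small enough to show whole; this is a sketch of the logic rather than the deployed code:

```javascript
// Format the time as HHMM and compare it to its own reverse
// (ignoring the colon): 02:20, 13:31 and 23:32 pass; 12:34 does not.
function isPalindromeTime(date) {
  const hh = String(date.getHours()).padStart(2, '0');
  const mm = String(date.getMinutes()).padStart(2, '0');
  const digits = hh + mm; // e.g. 13:31 -> "1331"
  return digits === [...digits].reverse().join('');
}

console.log(isPalindromeTime(new Date(2024, 0, 1, 13, 31))); // true
```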

Previous Iteration

UI Drafts

UI Draft #2 BCI & Processing

This is the interactive wireframe so far for my BCI interactive installation. Basically, I'm trying out ways to better communicate what's going on when using the Mindwave, and how we can translate its signal into a more structured task. The code for this UI wireframe can be found in this GitHub repo.

Morse Code Translator

Inspired by the "Hi Juno" project, I sought an easier way to use Morse code. This is why I created the Morse Code Translator, a program that translates your text input into "morsed" physical pulses. One idea to explore further is how words might be expressed in physically perceivable ways (sound, light, taste?, color?, temperature?).

So far I've successfully implemented the serial communication and the Arduino functionality. In other words, the idea works up to the Arduino's built-in LED (pin 13). This is how a "HI" looks when translated into light.


Follow-up: making the solenoid work through Morse-coded pulses. You can find the Processing and Arduino code in this GitHub repo.

 

NUI BCI Study #1 "Mindwave"

 

Through this first exploration of interfacing with NeuroSky's Mindwave, I've learned a couple of things about EEG and Processing. The library I'm currently working with is called ThinkGear, which allows reading different signals (low and high values for alpha, beta, and gamma waves, delta and theta signals, plus a blink signal). Beyond the annoying Bluetooth pairing, this consumer interface is still a work in progress, and Processing's latency doesn't make user feedback any easier. I'm sure there are better ways to interface with it to optimize user feedback (other software), and there should be better consumer EEG devices out there. Nonetheless, it has been a thrilling experience that helped me better understand sine and cosine functions, arrays, and libraries. Here's the second draft I've crafted with this curious natural brain-computer interface. The code for this UI draft can be found in this GitHub repo.
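For reference, the ThinkGear library talks to NeuroSky's ThinkGear Connector, which, as I recall from the ThinkGear Socket Protocol, serves newline-delimited JSON on localhost port 13854. A hedged Node.js sketch of that handshake follows; the field names should be verified against NeuroSky's documentation.

```javascript
// Connect to the ThinkGear Connector and log a couple of parsed signals.
const net = require('net');

const client = net.connect(13854, '127.0.0.1', () => {
  // ask for parsed JSON rather than raw EEG samples
  client.write(JSON.stringify({ enableRawOutput: false, format: 'Json' }));
});

client.on('data', buf => {
  for (const line of buf.toString().split('\n')) {
    if (!line.trim()) continue;
    try {
      const msg = JSON.parse(line);
      if (msg.eegPower) console.log('alpha (low):', msg.eegPower.lowAlpha);
      if (msg.blinkStrength) console.log('blink!', msg.blinkStrength);
    } catch (e) { /* ignore partial lines split across packets */ }
  }
});
```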

UI Draft #1

With the open-source Java toolkit Processing, I started exploring user interfaces, time representation, and hover timing. Hover timing might bring interesting possibilities for natural user interfaces such as the Kinect or Leap Motion, where different affordances come into play for simple tasks like selecting an element. The code for this draft can be found in this GitHub repo.
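The draft itself is written in Processing, but the dwell-selection idea translates directly to p5.js; in this sketch an element is "selected" after the cursor hovers over it for one second (an arbitrary dwell time):

```javascript
// Dwell selection: hovering over the button for DWELL_MS acts as a click,
// which is useful for NUIs (Kinect, Leap Motion) with no physical button.
const DWELL_MS = 1000;
const btn = { x: 120, y: 100, w: 160, h: 60 };
let hoverStart = null;
let selected = false;

function setup() { createCanvas(400, 260); }

function draw() {
  background(240);
  const over = mouseX > btn.x && mouseX < btn.x + btn.w &&
               mouseY > btn.y && mouseY < btn.y + btn.h;
  if (over) {
    if (hoverStart === null) hoverStart = millis();
    if (millis() - hoverStart >= DWELL_MS) selected = true;
  } else {
    hoverStart = null;
    selected = false;
  }
  fill(selected ? 'seagreen' : 'steelblue');
  rect(btn.x, btn.y, btn.w, btn.h);
  // show dwell progress as a growing bar under the button
  if (over && !selected) {
    const t = (millis() - hoverStart) / DWELL_MS;
    fill(0);
    rect(btn.x, btn.y + btn.h + 10, btn.w * t, 6);
  }
}
```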

Peru's Pavilion in FILBO 2014

Perú was the guest of honor at Bogotá's International Book Fair (FILBO). Panoramika was commissioned by the Peruvian Cultural Office to create various interactive installations, projection mappings, and light designs, and I was assigned one of the three installations Panoramika crafted: four screens that reveal to passersby random excerpts from "Captain Pantoja and the Special Service", Nobel laureate Mario Vargas Llosa's comedic novel.

To implement the installation, I developed Quartz Composer patches that visually changed text compositions whenever people crossed the threshold of an infrared sensor. The sensor was implemented with Arduino and interfaced to QC.

Systema Solar Live Act

Brief

The Colombian band Systema Solar commissioned us to create an interactive live show. With a team of three creative technologists, we developed various real-time visual effects. I was in charge of coding the puppetry controls and the audio-reactive silhouette patches, and of figuring out UX best practices. We created a VJ deck, from the physical rack to the digital patches.

Overview

To better understand the puppetry possibilities of the Kinect, we first figured out how Animata worked. After that first glimpse, I began building the patch from scratch in the live-programming environment VVVV. Even though I had no previous experience with the Kinect or VVVV, this project was evidence of perseverant work, squeezed wit, and sought-after fortune. By the end, there were three crafted puppets of Systema Solar's crew: Johnpri (lead singer), Walter (lead performer and singer), and Corpas (DJ/scratcher).

The VJ Deck

The rack is composed of one Kinect, three GoPro cameras, seven signal converters, one MIDI pad, one Mac mini, and one four-channel mixer. These four video signals (the Kinect and the three GoPros) are the input to the VJ's laptop.

Rehearsal

Video Documentation

Interactive Table – "Planta Interactiva"

Overview

This project blends the interactive-tabletop framework known as reacTIVision with an engaging way of explaining biodiesel production. I was the full-stack designer, creating the UX, storyboards, 3D motion graphics, and creative coding such as animated buttons and knobs.

The objective of this exploration is to engage prospective engineering students in an experience that helps them understand what each of the engineering programs offered by the faculty involves. Each of the four programs offered at Universidad de la Sabana has a specific narrative within the playful experience of creating biodiesel.

Storyboard

Tryout