Ideation Tool – Redesign & Interactive Prototype

Previous Iteration


Call for Interaction

With mobile device peripherals in mind, I set out to create a more playful interaction, one born at the intersection of common real-world gestures and the mobile phone's accelerometer.

I borrowed two playful gestures from daily observation: the reflexive spin and the swift juggle. Both seemed technically feasible and experientially engaging.

UI Redesign

I decided to simplify the interface around the new experience. The accelerometer remixes the images, and a tap reveals a generated text prompt. This text is a computational mix of the descriptions of the objects shown on screen, so people can be inspired both visually and textually.

Interactive Prototype

The interactive prototype was made in JavaScript using the Cooper Hewitt Museum's API, RiTa (a toolkit for computational literature), and the p5.js library. This is where the interactive prototype can be experienced.

Through JavaScript, I retrieve data from the Cooper Hewitt Museum's online collection database, including images and text. I clean the information and select a topic; in this case, all objects in the museum related to 3D printing.
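
For illustration, here is a minimal sketch of that retrieval step. The method name, parameters, and response fields follow my reading of the Cooper Hewitt REST API and should be verified against its docs; the access token is a placeholder.

```javascript
// Fetch objects related to "3D printing" from the Cooper Hewitt API.
// Method, parameters, and response fields are assumptions to verify
// against the current documentation.
const API = 'https://api.collection.cooperhewitt.org/rest/';
const TOKEN = 'YOUR_ACCESS_TOKEN'; // placeholder

async function fetch3DPrintingObjects() {
  const params = new URLSearchParams({
    method: 'cooperhewitt.search.objects',
    access_token: TOKEN,
    query: '3D printing',
    has_images: '1',
    per_page: '50'
  });
  const rsp = await (await fetch(`${API}?${params}`)).json();
  // Keep only what the prototype needs: an image URL and a description.
  return rsp.objects
    .filter(o => o.description && o.images && o.images.length)
    .map(o => ({
      description: o.description,
      image: o.images[0].b.url // 'b' (big) size variant; layout assumed
    }));
}
```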

Gestures

It turns out the spinning gesture is one of the blind spots of phone accelerometers. This is why the prototype only responds to juggling-type gestures.
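
Gesture detection then reduces to p5.js's shake hooks, which respond to sharp changes in acceleration. A minimal sketch; the threshold value is a guess to tune on-device, and remixImages is a hypothetical helper:

```javascript
// p5.js sketch: react to a juggling-style toss while ignoring spins.
// setShakeThreshold()/deviceShaken() are p5.js hooks that fire on
// sharp changes in accelerationX/Y.
function setup() {
  createCanvas(windowWidth, windowHeight);
  setShakeThreshold(40); // tune on-device: high enough to skip hand jitter
}

function deviceShaken() {
  // A quick toss spikes linear acceleration; a smooth spin barely does.
  remixImages();
}

// Hypothetical helper: swap the three objects shown on screen.
function remixImages() {
  console.log('shuffle the three on-screen objects');
}
```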

Text Prompt

By retrieving the descriptions of the three objects shown on screen, I create one phrase by remixing their tokens through a set of computational procedures. Every time the images change, the pool of tokens from which phrases are built changes with them.
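
A minimal sketch of that remix, assuming RiTa's tokenizer; the shuffle and the phrase length are illustrative choices, not the exact procedure used:

```javascript
// Remix one phrase out of the three on-screen object descriptions.
// RiTa.tokenize()/untokenize() are real RiTa calls; the Fisher-Yates
// shuffle and the 12-token phrase length are illustrative.
function remixPrompt(descriptions) {
  const tokens = descriptions.flatMap(d => RiTa.tokenize(d));
  for (let i = tokens.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [tokens[i], tokens[j]] = [tokens[j], tokens[i]];
  }
  return RiTa.untokenize(tokens.slice(0, 12));
}
```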

Even though the prompted phrases have grammatical errors, embracing the computational glitchiness aligns with the overall playful, mind-diverting concept of overcoming a creative block.

Mind the Needle — Popping Balloons with Your Mind 0.2


Concept

Time's running out! Will your concentration drive the needle fast enough? Through the consumer EEG headset MindWave, you can watch your concentration level drive the speed of the needle's arm and, maybe, pop the balloon!

Second UI Exploration

Development & UI

I designed, coded, and fabricated the entire experience as an excuse to explore how people approach interfaces for the first time and imagine how things could or should be used.

The current UI focuses on the experience's challenge: 5 seconds to pop the balloon. The previous UI focused more on visually communicating the concentration signal (from now on called the Attention signal).

This is why the timer is prominent in size, location, and color: it is bigger than the Attention signal and the needle's digital representation, and it sits on the left so people read it first. Yet even though the Attention signal is visually represented, the recurrent question that emerged at NYC Media Lab's "The Future of Interfaces" and ITP's Winter Show was: what should I think of?

Showcase

Insights

What drives the needle is the intensity of concentration, or overall electrical brain activity, which can be achieved in different ways, such as solving basic math problems (a recurrently successful on-site exercise). More importantly, the question might point to an underlying lack of feedback from the physical device itself; a more revealing question would be: how could feedback in BCIs be better? Another reflection on this interactive experience: what would happen if the playful challenge were addressed differently, moving the needle only when the Attention signal exceeds a certain threshold? That variant is sketched below.
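
A minimal sketch of that threshold variant, in p5.js-style JavaScript; the attention source, the gate, and the speed mapping are all assumptions:

```javascript
// Threshold variant: the needle only advances while Attention is above
// a gate. 'attention' would be the MindWave's 0-100 attention value.
const GATE = 60;
let needleAngle = 0; // degrees, mirrors the servo's 0-180 range

function updateNeedle(attention) {
  if (attention > GATE) {
    // Above the gate, stronger attention means faster (never backward).
    needleAngle += map(attention, GATE, 100, 0.2, 2.0);
    needleAngle = min(needleAngle, 180);
  }
  // Below the gate the needle holds its position instead of retreating.
}
```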

Previous Iterations

Generative Soundscape Concept

While studying interactive technologies at NYU's Interactive Telecommunications Program, I worked towards an interactive installation that blended the bowling gesture with scattered half-spheres that generate a musical experience. It is an evolved, collaborative idea that grew out of the Generative Sculptural Synth. The ideal concept is an interactive synthesizer made up of replicated sound-generating modules, triggered by a rolling sphere that creates a chain reaction throughout the installation's configuration.


Methodology: Iterative Concept and Prototyping

Tools: Arduino and littleBits

Deliverables: Prototype of modules that communicate and trigger one another through motion and sound range

It started out as a re-configurable soundscape and evolved into an interactive, bocce-like generative instrument. Here's an inside scoop of the brainstorming session where my teammate and I sought common ground (1. Roy's ideal pursuit, 2. My ideal pursuit, 3. Converged ideal).

Audio Input Instructable


littleBits –whatever works–

After the slam-dunk failure of the DIY audio input, I realized the convenience (however limited) of prototyping with littleBits. This way, I could concentrate on the trigger event rather than getting stuck at circuit sketching. I was able to program a simple timer for a module to "hear" (a boolean triggered by the microphone) and a timer for the module to "speak" (a boolean that generates a tone). What I learned about the limitations of the littleBits sensors is twofold. There is a Sound Trigger and a conventional Microphone, and both bits come with their circuits already solved, which turned out to be useful but limiting. The Sound Trigger has an adjustable gain, an embedded (uncontrollable) 2-second timer, and a pseudo-boolean output signal; so even though you can adjust its sensitivity, you can't actually work with its values in the Arduino IDE. The Microphone bit has an offset output (around ±515 in serial values), but its gain is rather insensitive.
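
For illustration, the hear/speak cycle can be modeled as a gated timer. A sketch in JavaScript (the real version ran as an Arduino sketch on littleBits hardware; the tone duration and helper names are assumptions):

```javascript
// Illustrative model of one module's "hear"/"speak" cycle.
const SPEAK_MS = 400;  // how long the module "speaks" its tone
let speaking = false;  // a module should not retrigger itself

// Called when the Sound Trigger bit flips its pseudo-boolean output high.
function onHear() {
  if (speaking) return;
  speaking = true;
  toneOn();                 // stand-in for driving the speaker bit
  setTimeout(() => {
    toneOff();
    speaking = false;       // ready to "hear" the next module
  }, SPEAK_MS);
}

function toneOn()  { console.log('tone on'); }
function toneOff() { console.log('tone off'); }
```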

This is why, when conveniently using the Sound Triggers, pitch becomes proportional to distance: the modules trigger from closer range when lower pitches are sensed, and vice versa. However, since the Sound Trigger bits are pseudo-boolean, no real frequency analysis is possible.

Mind the Needle Iteration 0.1

First Iteration of Mind the Needle, an exploration of emergent interfaces.

Mind the Needle is a project exploring the commercially emergent user interfaces of EEG devices. After establishing the goal (popping a balloon with your mind, by mapping the attention signal to a servo whose arm holds a needle), the project focused on better understanding how people approach these new interfaces and how we can start creating better practices around BCIs (Brain-Computer Interfaces). Mind the Needle came to fruition after considering different scenarios, and it focuses on finding the best way to communicate progression through the attention signal. In the end we decided to portray only forward movement, even though the attention signal varies constantly. In other words, the amount of attention affects only the speed of the arm, not its actual position. The arm can only move forward, to better communicate progression in such intangible, rather ambiguous interactions (such as brain-wave signals), which in the end mitigates frustration.
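
That baseline mapping is simple to sketch in p5.js-style JavaScript: attention is integrated into position, so the arm can only advance. The ranges and the attention stub are assumptions:

```javascript
// Attention drives the arm's SPEED, never its position.
let armAngle = 0; // mirrors the servo, 0-180 degrees

function draw() {
  const attention = readAttention();          // 0-100 from the headset
  armAngle += map(attention, 0, 100, 0, 1.5); // speed, not position
  armAngle = constrain(armAngle, 0, 180);
  // ...render both arcs and send armAngle to the servo over serial
}

function readAttention() { return 50; } // stub for the MindWave stream
```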

The first layout chosen was two arcs of the same size, splitting the screen in two: the arc on the left is the user's Attention feedback, and the other arc is the digital representation of the arm.

After the first draft, and a couple of rounds of feedback from people experimenting with just the graphical user interface, the need for the entire setup became clear. Early tryouts with the servo then yielded really important insights about the GUI. Even though the visual language (perceptual aesthetic) did convey progression and forwardness, the signs behind it remained unclear: people still expected the servo to move in step with the Attention signal. This is why, in the final GUI, this signal resembles a speedometer.

UI Alternatives

Physical Prototype

Sidenote: to ensure a successful popping strike, the servo makes one quick final slash as the arm reaches the end of its travel (θ approaching 180º): servo.write(170); delay(10); servo.write(178);

Morse Code Translator

Inspired by the "Hi Juno" project, I sought an easier way to use Morse code. This is why I created the Morse Code Translator, a program that translates your text input into "morsed" physical pulses. One idea to explore further: how could words be expressed in physically perceivable ways (sound, light, taste?, color?, temperature?).

So far I've got the serial communication and the Arduino functionality working. In other words, the idea works up to the Arduino's built-in LED (pin 13). This is how "HI" looks when translated into light.
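
For illustration, here is the translation step sketched in JavaScript (the actual repo uses Processing for the serial side and Arduino for the pulses; the base time unit is an assumption, while the 1/3/7-unit gaps are standard Morse timing):

```javascript
// Text -> Morse timing. One unit is a dot; a dash is 3 units; gaps are
// 1 unit within a letter, 3 between letters, and 7 between words.
const MORSE = {
  a: '.-', b: '-...', c: '-.-.', d: '-..', e: '.',  f: '..-.',
  g: '--.', h: '....', i: '..',  j: '.---', k: '-.-', l: '.-..',
  m: '--',  n: '-.',  o: '---', p: '.--.', q: '--.-', r: '.-.',
  s: '...', t: '-',   u: '..-', v: '...-', w: '.--', x: '-..-',
  y: '-.--', z: '--..'
};
const UNIT_MS = 150; // assumed base unit

// Returns alternating [on, off, on, off, ...] durations in milliseconds,
// ready to drive an LED or a solenoid.
function toPulses(text) {
  const pulses = [];
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  words.forEach(word => {
    [...word].forEach((letter, li) => {
      const code = MORSE[letter] || '';
      [...code].forEach((symbol, si) => {
        pulses.push((symbol === '.' ? 1 : 3) * UNIT_MS); // signal on
        const lastSymbol = si === code.length - 1;
        const lastLetter = li === word.length - 1;
        const gap = !lastSymbol ? 1 : !lastLetter ? 3 : 7;
        pulses.push(gap * UNIT_MS);                      // signal off
      });
    });
  });
  return pulses;
}

console.log(toPulses('hi')); // "HI" as on/off durations for pin 13
```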


Follow-up: making the solenoid work through Morse-coded pulses. You can find the Processing and Arduino code in this GitHub repo.

Tangible Retail Display

Images taken from their Blog Post

After a lot of searching and looking around, I stumbled upon a company that creates interactive products for commercial scenarios, moving beyond tactile interfaces onto tangible ones. The product is triggered by lifting one of the items sold in the store, which exposes an album of first-person stories around the brand's diverse products. Even though it sets up an innovative consumer experience, after half an hour of waiting for someone to try it, I finally decided to take it for a spin. The product is a sealed black box containing what I imagine is a projector, a computer, and a camera. The main idea behind it is to transform any surface into an interactive tangible user interface. Basically, this is a usable interactive experience with catchy stories, built on top of a tracking framework.

The fact that this product interfaces with real, tangible artifacts opens an entire realm of possibilities, even though here it was only used to trigger a strictly tactile command interface. This tactile 2D interface had the proper affordances for easily manipulating the experience: results were easy to notice when navigating and selecting features, and because it was built on top of the tactile-interface paradigm, it was really easy to learn. However, it lacked the first principle of interaction design: it wasn't perceivable as an interactive display at first sight. I'm not really sure why, but its call to action (or lack thereof) left clients adrift. Even though the product had a blinking text prompt, roughly a tenth of the display's height, inviting people to interact ("Please lift to read the stories"), the overall idea of how to start the interactive experience wasn't overly persuasive. Maybe it resembled the kind of light-display installation you're not supposed to touch, but I'm not 100% certain. Overall, the five-minute experience was entertaining.

The hypothesis I had before approaching the product was that this interface should aim for what Norman calls the affective approach: considering that the context and goal are retail-oriented, it is not a scenario that requires serious, concentrated effort to reach its goal. In this order of ideas, the product balances beauty and usability fairly well, conveying easy-going use and contemplation.

IxD Principles


"The answers should be given by the design, without any need for words or symbols, certainly without any need for trial and error." (Don Norman)

The answers Don Norman addresses are PERCEIVED through affordances. As he describes them, affordances are "primarily those fundamental properties that determine just how the thing could possibly be used [, and] provide strong clues to the operations of things". Thus, affordances allow the transition from the first principle to the second (Perceivability to PREDICTABILITY). It is thanks to these visible assets in products (affordances) that people are able to interact: to operate and manipulate. Thanks to FEEDBACK (the third principle), people can understand and overcome errors (the machine's) and mistakes (people's). Through repeated interaction, people LEARN how to use a product, and thanks to CONSISTENT standard practices among similar products, they can transfer its usage from one type of product to another.

Besides usage standards and best practices, Don Norman addresses the importance of affect in the design process. He points out the nuances between negative and positive affect, and draws attention to the importance of good human-centered design when addressing stressful situations. In the end, he emphasizes that "[t]rue beauty in a product has to be more than skin deep, more than a façade. To be truly beautiful, wondrous, and pleasurable, the product has to fulfill a useful function, work well, and be usable and understandable."

What is Interaction?

I like to think of interaction design as the work of creating models and experiences that attempt to closely represent people's imagination, their conceptual models. Chris Crawford's metaphor of conversation is the most concise and enlightening explanation I've read so far. Luminaries within the interaction design realm such as Bill Moggridge or Gillian Crampton Smith have wonderful explanations, yet Crawford's self-contained metaphor gives IxD's explanation an elegant simplicity with just one word. From the implications of conversation that Crawford describes, there are a couple of concepts to highlight: the cyclic nature of the conversation between actors, and fun as a key qualitative factor of highly interactive designs. To guarantee this cycle, he stresses three equally necessary factors (listening, thinking, and speaking) for a conversation to be considered good. This is certainly an entertaining challenge when designing interactive works.

Crawford goes on to pinpoint the revealing differences between IxD and similar disciplines such as interface design. The difference lies specifically in the in-between factor of a conversation: thinking. Interaction design differs from interface design by addressing how the work will behave, through algorithms. He assembles an articulate comparison that sets the stage for an afterthought analogy: interaction design is to interface design as industrial design is to graphic design. He describes that "[...] the user interface designer considers form only and does not intrude into function, but the interactivity designer considers both form and function in creating a unified design." It is a systemic approach that never gets easy, yet is enormously fulfilling whenever "people identify more closely with it [the interactive work] because they are emotionally right in the middle of it." In other words, interaction design is amazing thanks to the engaging, emotionally involving experiences it provokes.

In the end, Crawford finishes with a cautious call to action, encouraging the reader to "exploit interactivity to its fullest and not dilute it with secondary business." This is exactly what the prodigious creator and visionary Bret Victor denounces about today's consumer-tech panorama. He is alarmed by the status quo's acceptance of a narrow vision of interaction's future trapped behind a flat surface. Victor advocates for tools, where a tool "addresses human needs by amplifying human capabilities". It is through everyday objects' properties that interaction design feedback should be crafted. He wittily highlights haptic feedback and explains grip typology (power, precision, and hook grips). These premises would allow interaction design to craft more intuitive works where, hopefully, people can seamlessly converse with other people (fingers crossed) and seamlessly experience works and devices. Victor wraps up with an encouraging suggestion to "be inspired by the untapped potential of human capabilities" and asks: "[w]ith an entire body at your command, do you seriously think the Future Of Interaction should be a single finger?"

Even though gestural natural interfaces, such as Disney Research's lovely AIREAL concept, cast an interesting future for interaction, there is still fine-tuning to be done within the beneficial aesthetic realm.

Aireal: Interactive Tactile Experiences in Free Air. (n.d.). Retrieved September 9, 2014.

Crawford, C. (2002). The Art of Interactive Design: A Euphonious and Illuminating Guide to Building Successful Software. San Francisco: No Starch Press.

Victor, B. (2011, November 8). A Brief Rant on the Future of Interaction Design. Retrieved September 8, 2014.

Lab Time App

With Pinedot Studios, we created a pick-up scheduling app to help clinical labs organize their samples. I was the Product Designer (UI/UX) behind it. It is available natively on iOS and Android. We decided to build on top of the labs' current scheduling system, which categorizes samples by color and letters. This way, we ensured the application could be easily learned by the target audience; in this case, lab assistants and managers.

Unisabana App

In mid-2014, along with Jenny Robayo and a team of students, we launched the official mobile application for Universidad de La Sabana. The creation process involved a human-centered design approach and a semester of hard work. I was the Product Designer behind it, from UX research to UI development; a good old generalist.

Methodology: Human-Centered Design

Methods: Focus Groups, Interviews, Stakeholder Prioritization, Design Principles, Personas, Wireframes, Mockups & Prototypes

Tools: Adobe Illustrator, Keynote, G-Suite

Deliverables: Research Analysis, Information Architecture, Screen Designs and Assets

From left to right, Rafael Rodriguez (Integration Lead), Ricardo Sotaquira (Information Technology Engineering Director and Project's Director), Francisco Ramirez (Design Lead), Jenny Robayo (Development Lead), Juan Pablo Velasquez (Student), Alejandro Zambrano (Student). Two students missing: David Piñeros and Nicolas Guzman

Even though this project involved NDAs, the overall process began with the core team (director and leads) drafting the project's proposal and design concept. After agreeing on a brief with the stakeholders, we conducted user research to validate our initial hypotheses and discover new opportunities.

The video at the left is an excerpt from the user research activities I conducted. They were held in the first design phase, listening to our main audience: the student community.

Out of this research we crafted design principles to guide the app's UX and UI. We also created personas that informed the usage scenarios.

By overlapping users' needs, stakeholders' desires, and the development team's abilities, we decided which features to approach first. Once everyone was on the same page, I created tentative layouts through wireframes and mockups, which I later translated into tap-through prototypes.

We delivered a mobile app with a set of features supporting the campus community's daily activities (news, menu, events) and a breakthrough feature to encourage students to improve their performance: a virtual academic advisor. The advisor suggests what a student should aim for on the next graded assignment (exams, projects, homework) according to a desired course grade; the kind of arithmetic involved is sketched below.
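
As an illustration, here is a sketch assuming simple weighted averaging on Colombia's 0-5 grade scale; the app's actual formula was covered by the NDA, so all names and numbers here are hypothetical:

```javascript
// Sketch: minimum score needed on the remaining assignment to reach a
// target course grade, assuming simple weighted averaging.
function neededScore(graded, targetGrade) {
  // graded: [{ score, weight }] for assignments already graded
  const earned = graded.reduce((sum, a) => sum + a.score * a.weight, 0);
  const usedWeight = graded.reduce((sum, a) => sum + a.weight, 0);
  return (targetGrade - earned) / (1 - usedWeight);
}

// A student averaging 3.5 over 60% of the course, aiming for a 4.0:
console.log(neededScore([{ score: 3.5, weight: 0.6 }], 4.0)); // 4.75
```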

This is a run-through of the different services, natively developed for iOS and Android and available for download at their corresponding links.

Systema Solar Live Act

Brief

The Colombian band Systema Solar commissioned us to create an interactive live show. With a team of three creative technologists, we developed different real-time visual effects. I was in charge of coding the puppetry controls and the audio-reactive silhouette patches, and of figuring out the best UX practices. We created a VJ deck, from the physical rack to the digital patches.

Overview

To better understand the puppetry possibilities with Kinect, we figured out how Animata worked. After a first glimpse, I began this patch from scratch in the live-visuals software VVVV. Even though I had no previous experience with Kinect or VVVV, this project was evidence of perseverant work, squeezed wit, and sought-after luck. By the end, there were three crafted puppets of Systema Solar's crew: Johnpri (lead singer), Walter (lead performer and singer), and Corpas (DJ/scratcher).

The VJ Deck

The rack is composed of 1 Kinect, 3 GoPro cameras, 7 signal converters, 1 MIDI pad, 1 Mac mini, and 1 four-channel mixer. These four video signals (the Kinect plus the three GoPros) are the input for the VJ's laptop.

Rehearsal

Video Documentation

Enrutate iOS App

Back in 2013, together with two developers, we created this mobile app. I designed the UI, including the simplified map of Bogotá, as well as the promotional motion graphics. It was the first mobile application to interact directly with the map: a more perceivable and predictable (intuitive) interaction compared with the competition (public transport apps in Bogotá, Colombia).

Interactive Table – "Planta Interactiva"

Overview

This project blends the interactive-tabletop framework known as reacTIVision with an engaging way of explaining biodiesel production. I was the full-stack designer, creating the UX, storyboards, and 3D motion graphics, and doing creative coding such as animated buttons and knobs.

The objective behind this exploration is to engage potential new engineering students in an experience that helps them understand what each of the engineering programs offered by the faculty involves. Each of the four programs offered at Universidad de La Sabana has a specific narrative within the playful experience of creating biodiesel.

Storyboard

Tryout