Ricoh Theta
This is an initial 360º video prototype made with the Ricoh Theta.
This performance was made possible by a Graduate Student Organization (GSO) Grant at NYU's Tisch School of the Arts. In collaboration with Daniela Tenhamm-Tejos, Jana L. Pickart and Ansh Pattel, we explored body language and psychogeography in urban spaces. I was the developer behind the Gesture Recognition code. For the official website, please visit this link.
With this project I created a series of visual effects that responded to the performer's choreography, the poet's voice and the audience's interaction. These effects were created in the C++ toolkit known as OpenFrameworks. Here is a sneak peek of these effects.
Time's running out! Will your concentration drive the Needle fast enough? Through the Mindwave, a consumer EEG headset, you can watch your concentration level drive the speed of the Needle's arm and, maybe, pop the balloon!
I designed, coded and fabricated the entire experience as an excuse to explore how people approach interfaces for the first time and imagine how things could or should be used.
The current UI focuses on the experience's challenge: 5 seconds to pop the balloon. The previous UI focused more on visually communicating the concentration signal (from now on called the Attention signal).
This is why the timer is given prominence in size, location and color: it is bigger than the Attention signal and the Needle's digital representation, and it sits on the left so people read it first. Yet even though the Attention signal is visually represented, the recurring question that emerged at NYC Media Lab's "The Future of Interfaces" and ITP's "Winter Show" was: what should I think of?
What drives the Needle is the intensity of concentration, or overall electrical brain activity, which can be raised in different ways; solving basic math problems, for example, was a recurrently successful on-site exercise. More importantly, that question might point to an underlying lack of feedback from the physical device itself, so a more revealing question would be: how could feedback in BCIs be better? Another reflection on this interactive experience: what would happen if the playful challenge were framed differently, moving the Needle only when the Attention signal exceeds a certain threshold?
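A minimal sketch of that thresholded variant, assuming a normalized 0-100 attention value; the gate and step size are hypothetical:

```processing
int ATTENTION_THRESHOLD = 60;  // hypothetical gate on the 0-100 scale
float armAngle = 0;            // arm position in degrees

// Variant: the arm advances only while attention exceeds the gate,
// instead of always creeping forward at a signal-driven speed.
void updateArm(int attention) {
  if (attention >= ATTENTION_THRESHOLD) {
    armAngle = min(armAngle + 0.5f, 180);  // fixed step, capped at the end
  }
}
```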
Mind the Needle is a project exploring the commercially emergent user interfaces of EEG devices. After establishing the goal of popping a balloon with your mind, by mapping the attention signal to a servo whose arm holds a needle, the project focused on better understanding how people approach these new interfaces and how we can start creating better practices around BCIs (Brain-Computer Interfaces). Mind the Needle came to fruition after considering different scenarios, and it focuses on finding the best way to communicate progression through the attention signal. In the end we decided to portray only forward movement, even though the attention signal varies constantly. In other words, the amount of Attention affects only the speed of the arm, never its actual position. The arm can only move forward, which better communicates progression in such intangible, rather ambiguous interactions as brain-wave signals, and in the end mitigates frustration.
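A minimal sketch of that forward-only mapping, assuming a normalized 0-100 attention value arriving each frame; the names and rates are illustrative, not the project's actual code:

```processing
float armAngle = 0;  // accumulated arm position in degrees

// Attention modulates speed, never position: each frame the arm
// integrates forward at a rate proportional to the signal, so the
// gauge can only advance, even when attention dips.
void driveArm(int attention) {
  float speed = map(attention, 0, 100, 0, 2);  // degrees per frame
  armAngle = min(armAngle + speed, 180);       // stop at the balloon
}
```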
The first layout we chose was two arcs of the same size, splitting the screen in two: the arc on the left is the user's Attention feedback, and the other arc is the digital representation of the arm.
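A rough sketch of that first layout, with illustrative sizes:

```processing
// Left arc = live Attention level, right arc = the arm's position.
void drawGauges(float attention, float armAngle) {
  noFill();
  strokeWeight(8);
  // Left: the attention arc sweeps with the normalized 0-100 signal.
  arc(width * 0.25f, height * 0.5f, 300, 300,
      PI, PI + radians(map(attention, 0, 100, 0, 180)));
  // Right: the arm arc mirrors the servo's accumulated angle.
  arc(width * 0.75f, height * 0.5f, 300, 300,
      PI, PI + radians(armAngle));
}
```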
After the first draft, and a couple of rounds of feedback from people experimenting with just the graphical user interface, the need for the entire setup became clear. And after the first tryouts with the servo, some really important insights about the GUI emerged. Even though the visual language used (the perceptual aesthetic) did convey progression and forwardness, the signs behind it remained unclear: people were still expecting the servo to move in step with the Attention signal. This is why, in the final GUI, this signal resembles a velocimeter.
Physical Prototype
Sidenote: to ensure a successful popping strike at the end, the servo makes a quick slash once the arm reaches the end of its travel (if θ ≥ 180º): {θ = 170º; delay(10); θ = 178º;}
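A sketch of that strike, assuming the servo is driven from Processing through the Arduino (Firmata) library; the pin and exact angles are illustrative:

```processing
import processing.serial.*;
import cc.arduino.*;  // Arduino (Firmata) library for Processing; an
                      // assumption, the project may drive the servo differently

Arduino arduino;
int SERVO_PIN = 9;    // hypothetical servo pin

void setup() {
  arduino = new Arduino(this, Arduino.list()[0], 57600);
  arduino.pinMode(SERVO_PIN, Arduino.SERVO);
}

// Once the accumulated angle reaches the end of travel, pull back
// slightly and slam forward so the needle reliably pops the balloon.
void strikeIfDone(float theta) {
  if (theta >= 180) {
    arduino.servoWrite(SERVO_PIN, 170);
    delay(10);
    arduino.servoWrite(SERVO_PIN, 178);
  }
}
```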
Through this first exploration of interfacing with Neurosky's Mindwave, I've learned a couple of things about EEG and Processing. The library I'm currently working with is called ThinkGear; it exposes the different signals (low and high values for alpha, beta and gamma, plus the delta and theta signals, and a blink signal). Besides the annoying Bluetooth pairing, this consumer interface is still in the making, and Processing's latency doesn't make user feedback any easier. I'm sure there are better ways of interfacing with it to optimize user feedback (other software), and there should be better consumer EEG devices out there. Nonetheless, it has been a thrilling experience to better understand the sine and cosine functions, arrays and libraries. Here's the second draft I've crafted with this curious natural Brain-Computer Interface. The code for this UI draft can be found in this GitHub repo.
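A minimal sketch of that setup, assuming the ThinkGear socket library for Processing; the callback names follow that library's examples, and the other event callbacks (meditation, raw EEG) are omitted for brevity:

```processing
import neurosky.*;  // ThinkGear socket library for Processing (assumed)

ThinkGearSocket neuroSocket;
int attention = 0;

void setup() {
  size(400, 200);
  neuroSocket = new ThinkGearSocket(this);
  try {
    neuroSocket.start();  // connects to the ThinkGear Connector's local socket
  } catch (Exception e) {
    println("Is the ThinkGear Connector running and the headset paired?");
  }
}

void draw() {
  background(0);
  // A bar that grows with the 0-100 Attention value pushed by the headset.
  rect(20, 120, map(attention, 0, 100, 0, width - 40), 40);
}

// Fired by the library roughly once per second.
void attentionEvent(int attentionLevel) {
  attention = attentionLevel;
}

// Blinks arrive as separate events with an estimated strength.
void blinkEvent(int blinkStrength) {
  println("blink: " + blinkStrength);
}
```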
VICE Colombia opened their headquarters at the beginning of 2014. For their launch party they invited Panoramika to create an interactive installation and multiple projection mappings. We created an array of projected eyes, mapped onto extruded circles on the wall, that followed viewers. We used Kinect, OpenFrameworks, Quartz Composer and Madmapper.
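The gaze behavior itself boils down to a little trigonometry; here is a minimal Processing-style sketch of the idea, with viewerX/viewerY standing in for the Kinect's tracked body position (the installation itself was built in OpenFrameworks):

```processing
// Each projected eye rotates its pupil toward the tracked viewer.
void drawEye(float eyeX, float eyeY, float viewerX, float viewerY) {
  float eyeR = 60, pupilR = 18;
  fill(255);
  ellipse(eyeX, eyeY, eyeR * 2, eyeR * 2);              // the white of the eye
  float angle = atan2(viewerY - eyeY, viewerX - eyeX);  // direction to viewer
  float px = eyeX + cos(angle) * (eyeR - pupilR);       // keep the pupil
  float py = eyeY + sin(angle) * (eyeR - pupilR);       // inside the eye
  fill(0);
  ellipse(px, py, pupilR * 2, pupilR * 2);              // the pupil
}
```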
We were commissioned to create an interactive live show for the Colombian band Systema Solar. With a team of three creative technologists, we developed different real-time visual effects. I was in charge of coding the puppetry controls and the audio-reactive silhouette patches, and of figuring out the best UX practices. We created a VJ deck, from the physical rack to the digital patches.
To better understand the puppetry possibilities of the Kinect, we figured out how Animata worked. After that first glimpse, I began this patch from scratch in the live-programming environment VVVV. Even though I had no previous experience with Kinect or VVVV, this project was evidence of perseverant work, squeezed wit and sought fortune. By the end there were three crafted puppets of Systema Solar's crew: Johnpri (lead singer), Walter (lead performer and singer) and Corpas (DJ/scratcher).
The rack is composed of one Kinect, three GoPro cameras, seven signal converters, one MIDI pad, one Mac Mini and one four-channel mixer. These four camera signals (the Kinect plus the three GoPros) are the input for the VJ's laptop.
This project blends the interactive-tabletop framework known as reacTIVision with an engaging way of explaining biodiesel production. I was the full-stack designer, creating the UX, storyboards and 3D motion graphics, and doing creative coding for elements like animated buttons and knobs.
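reacTIVision tracks fiducial markers placed under tangible objects and streams them over the TUIO protocol; here is a minimal client sketch, assuming the TUIO library for Processing (the knob mapping is illustrative, and the update/remove callbacks are omitted for brevity):

```processing
import TUIO.*;  // TUIO client library for Processing

TuioProcessing tuioClient;

void setup() {
  size(800, 600);
  // Listen on TUIO's default UDP port 3333, where reacTIVision sends updates.
  tuioClient = new TuioProcessing(this);
}

void draw() {
  background(0);
  // Every tangible on the table appears as a TuioObject with an id,
  // a position and a rotation angle; enough to drive an on-screen knob.
  for (TuioObject tobj : tuioClient.getTuioObjectList()) {
    pushMatrix();
    translate(tobj.getScreenX(width), tobj.getScreenY(height));
    rotate(tobj.getAngle());  // the marker's rotation becomes the knob value
    rect(-20, -20, 40, 40);
    popMatrix();
  }
}

// Called by the library when a new fiducial is placed on the table.
void addTuioObject(TuioObject tobj) {
  println("added marker " + tobj.getSymbolID());
}
```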
The objective behind this exploration is to engage prospective engineering students in an experience that helps them understand what each of the engineering programs offered by the faculty involves. Each of the four programs offered at Universidad de la Sabana has its own narrative within the playful experience of creating biodiesel.