Augmented Reality mobile app developed for the launch event of the new shoe line "009" by New Balance.
Three shoe stands, each displaying an image of the 009 shoe's imprint, were set up at a booth during the New Balance 009 launch event. Using our mobile app on their smartphones, guests could see a 3D model of the shoe projected onto the imprint.
Guests could take the AR experience home with them in the form of a 4”x4” coaster. The AR app could be downloaded by following a link printed on the coaster.
From within the app, guests could change the color of the shoe, zoom in to see it in finer detail, and purchase the model.
Project made in collaboration with Dylan Filingeri and Neighborhood Watch.
Technology: Unity 3D + Vuforia, 3D scanning, Maya
New Balance AR Experience - nblifestyle Instagram
New Balance AR Experience
DILATION is a study of motion-capture animation techniques through dance. The project explores differences in elapsed time through a sequence of eloquent dance solos and duets; it follows the idea of space-time relativity, where there are no absolute positions in time and space. A non-linear journey demonstrates how the body constructs and affects space, and how the perception of movement changes over time. The emotion intensifies from slow, conscious sections to rapid explorations until an eventual regression.
Project created to be displayed at the IAC Building in New York. The custom-made screen measures 11520px x 980px.
Project made in Unreal Engine 4. Models created in MakeHuman and rigged in MotionBuilder; additional modeling in UE4 and Maya. Performances captured using OptiTrack and Motive.
Produced in collaboration with Chang Liu. Music by De Ke - "CHAOS". Dance solos performed by Michaela Rae Mann and Berit Ahlgren.
Shy Art Show is a proposal for an art exhibition displaying a collection of ‘shy’ objects that actively avoid being looked at. The goal of the exhibition is to force the viewer to become aware of their process of experiencing the artwork and take time to appreciate the objects on display without the distraction of mobile phones and other electronic devices.
Interactive plan of the exhibition space made in Unreal Engine 4.
nth wall is a multimedia performance exploring the intersection of the physical world and virtual reality. The project is a result of a collaboration between Joanna Wrzaszczyk, Nicholas Bratton, Zhen Liu, and Dylan Filingeri.
New technologies such as the Oculus headset and augmented reality mobile apps have forced us to confront the digital and the physical, and to distinguish between the online and the offline. These two worlds often appear separate: the digital world is ‘virtual’ and the physical world is ‘real’. Through this project we try to merge material reality with digital information and blur the boundaries between the two worlds.
The finished product is a platform for exploring the permeable borders between the virtual and the real. It is a virtual reality world set in a real space that allows for real-life interactions with objects and people, enabling the user to physically cross from one world to the other. We use mixed-reality boundaries to understand and disrupt human perception and to trigger new ways of making sense of the world around us.
Project made in Unreal Engine 4. Motion capture recorded with OptiTrack and Motive.
nth wall - world walk-through
The Nth Wall: Interaction between mocap performer & VR immersee
3D models of real objects captured with Asus Xtion and Skanect. Edited in Maya and Unreal Engine 4.
The project is the result of a collaboration between Joanna Wrzaszczyk and Ross Goodwin. The lamp visualizes the traveling salesman problem between a set of 3D-printed node-cities modeled in Rhinoceros. The node-cities are designed based on descriptions of chosen cities from Italo Calvino’s book Invisible Cities.
The Traveler's Lamp is a three-dimensional set of vertices enclosed in an engraved, laser-cut Plexiglas form with a custom-made wooden stand. The vertices are connected with thin fishing line to LEDs that visualize a computer algorithm (running on a Raspberry Pi) approximating the traveling salesman problem between the node-cities in real time.
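The original text does not specify which TSP approximation the lamp runs; a minimal sketch of one common heuristic, greedy nearest-neighbor, over hypothetical 3D node-city coordinates (the names come from Invisible Cities, but the coordinates and function names here are illustrative placeholders):

```python
import math

# Hypothetical 3D coordinates for the node-cities (placeholders, not
# the actual positions inside the lamp).
CITIES = {
    "Zaira":   (0.0, 0.0, 0.0),
    "Isidora": (3.0, 1.0, 2.0),
    "Octavia": (1.0, 4.0, 0.5),
    "Ersilia": (5.0, 2.0, 3.0),
    "Baucis":  (2.0, 5.0, 4.0),
}

def dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_neighbor_tour(cities, start):
    """Greedy nearest-neighbor approximation of a TSP tour:
    from the current city, always travel to the closest unvisited one."""
    unvisited = set(cities) - {start}
    tour = [start]
    while unvisited:
        here = cities[tour[-1]]
        nxt = min(unvisited, key=lambda name: dist(here, cities[name]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

tour = nearest_neighbor_tour(CITIES, "Zaira")
print(tour)  # visits every node-city exactly once
```

In the lamp, each step of a tour like this would light the LED edge between the current city and the next, so the approximation can be watched as it unfolds.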
Mindful Lamp is a prototype of a 3D-printed smart health appliance that interprets the user's emotions by accessing their social media accounts and analyzing the sentiments in their feeds. Project developed in collaboration with Jedy Chen, Martin Romero, and Stream Gao.
Most current smart health and health-monitoring devices suffer from an imbalance between their data input/analysis modules and their feedback modules: they focus on gathering information and transforming it into readable data for the customer, but do little with that data afterwards. The Mindful Lamp server automatically collects emotion types from the user's social media and calculates an average 'sentiment' score (using sentiment analysis and image recognition to extract emotion values). The lamp reacts to the collected data by changing its color and shape, helping the user meditate and improve his or her mental well-being.
The ideal way to collect a user's emotions is passive and unconscious, requiring minimal user input. Based on these principles, we decided to use public feeds from social media accounts together with speech/tone analysis; users can also input their mood directly. The lamp uses the AlchemyAPI to analyze emotions.
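The averaging and feedback step described above can be sketched as follows. This is not the project's actual code: the score range, function names, and the red-to-blue color mapping are assumptions chosen for illustration.

```python
def average_sentiment(scores):
    """Mean of per-post sentiment scores, each assumed to lie in [-1.0, 1.0]
    (negative = agitated, positive = calm)."""
    if not scores:
        return 0.0
    return sum(scores) / len(scores)

def sentiment_to_rgb(score):
    """Map an average sentiment in [-1, 1] to a lamp color:
    -1 -> warm red (agitated), +1 -> cool blue (calm).
    The color scheme is a placeholder, not the project's actual palette."""
    t = (score + 1.0) / 2.0          # normalize to [0, 1]
    r = round(255 * (1.0 - t))
    b = round(255 * t)
    return (r, 0, b)

# Hypothetical per-post scores pulled from a social feed.
feed_scores = [0.8, -0.2, 0.5, 0.1]
color = sentiment_to_rgb(average_sentiment(feed_scores))
print(color)
```

On the device, the resulting RGB triple would drive the lighting module, while a similar scalar could drive the stepper motor that reshapes the lamp.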
An Arduino Yún serves as the general Mindful Lamp controller, managing its wireless (WiFi), lighting, and movement modules. The motion of the lamp is driven by a combination of one large gear and five small gears, powered by a stepper motor.