An augmented reality mobile app developed for the launch event of "009", a new shoe line by New Balance.
Three shoe stands, each displaying an image of the 009 shoe's imprint, were set up at a booth during the New Balance 009 launch event. Using our mobile app on their smartphones, guests could see a 3D model of the shoe projected onto the imprint.
Guests could take the AR experience home with them in the form of a 4”x4” coaster; the AR app could be downloaded by following a link printed on the coaster.
Within the app, guests could change the color of the shoe, zoom in to see it in finer detail, and purchase the model.
Project made in collaboration with Dylan Filingeri and Neighborhood Watch.
Technology: Unity 3D + Vuforia, 3D scanning, Maya
New Balance AR Experience - nblifestyle Instagram
New Balance AR Experience
HOW ANIMALS SEE: INSIDE THE EXTRAORDINARY WORLD OF ANIMAL VISION
WORK IN PROGRESS
A web-based VR experience exploring the differences between human and animal vision. It offers an artistic interpretation of how the world might appear to one of four animals (a bee, a dog, a bat, and a rattlesnake) and of how each animal's visual perception enhances its ability to find food or to detect predators and prey.
SHY ART SHOW
Shy Art Show is a proposal for an art exhibition displaying a collection of ‘shy’ objects that actively avoid being looked at. The goal of the exhibition is to make the viewer aware of their own process of experiencing the artwork and to encourage them to take time to appreciate the objects on display without the distraction of mobile phones and other electronic devices.
Interactive plan of the exhibition space made in Unreal Engine 4.
nth wall is a multimedia performance exploring the intersection of the physical world and virtual reality. The project is the result of a collaboration between Joanna Wrzaszczyk, Nicholas Bratton, Zhen Liu, and Dylan Filingeri.
New technologies such as the Oculus headset and augmented reality mobile apps have forced us to confront the digital and the physical, and to distinguish between the online and the offline. These two worlds often appear separate: the digital world is ‘virtual’ and the physical world is ‘real’. Through this project we try to merge material reality with digital information and blur the boundaries between these two worlds.
The finished product is a platform for exploring the permeable border between the virtual and the real. It is a virtual reality world set in a real space that allows for real-life interactions with objects and people, enabling the user to physically cross from one world to the other. We use mixed-reality boundaries to understand and disrupt human perception and to trigger new ways of making sense of the world around us.
Project made in Unreal Engine 4. Motion capture recorded with OptiTrack and Motive.
nth wall - world walk-through
The Nth Wall: Interaction between mocap performer & VR immersee
Mindful Lamp is a prototype of a 3D-printed smart health appliance that interprets a user's emotions by accessing their social media accounts and analyzing the sentiment of their feeds. Project developed in collaboration with Jedy Chen, Martin Romero, and Stream Gao.
Most current smart-health and health-monitoring devices are unbalanced between their data input and analysis modules and their feedback module: they focus on collecting information and turning it into readable data for the user, but do little with that data afterwards. The Mindful Lamp server automatically collects emotion signals from the user's social media and calculates an average sentiment score, using sentiment analysis on text and image recognition to extract emotion values. The lamp reacts to the collected data by changing its color and shape, helping the user meditate and improve their mental well-being.
Ideally, emotion collection should be passive and unconscious, requiring minimal input from the user. Based on these principles, we decided to use the public feeds from the user's social accounts together with speech and tone analysis; users can also enter their mood directly. The lamp uses the Alchemy API to analyze emotions.
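As an illustration only, the averaging step on the server might look like the sketch below. Both fetch_feed_items and sentiment_score are hypothetical placeholders standing in for the real social-media fetch and the Alchemy API call; neither reflects the actual server code.

#include <iostream>
#include <string>
#include <vector>

// Placeholder for the social-media fetch: the real server would pull
// recent public posts from the user's feed.
std::vector<std::string> fetch_feed_items() {
    return {"Great day at the park!", "So tired of this weather."};
}

// Placeholder for the sentiment-analysis call: returns a score in
// [-1, 1]. A trivial keyword check stands in for the real analysis.
double sentiment_score(const std::string& text) {
    if (text.find("Great") != std::string::npos) return 0.8;
    if (text.find("tired") != std::string::npos) return -0.5;
    return 0.0;
}

int main() {
    std::vector<std::string> items = fetch_feed_items();
    if (items.empty()) return 0;

    double sum = 0.0;
    for (const std::string& item : items) sum += sentiment_score(item);
    double average = sum / items.size();  // single mood value sent to the lamp

    std::cout << "average sentiment: " << average << "\n";
}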
An Arduino Yun serves as the general Mindful Lamp controller, driving its wireless (WiFi) module, lighting module, and movement module. The motion of the lamp is produced by one large gear and five small gears, driven by a stepper motor.
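A minimal sketch of what the on-board control loop could look like, assuming the averaged score is published to the Yun over its Bridge under a hypothetical "mood" key; the pin numbers, step counts, and color mapping below are placeholders rather than the lamp's actual firmware.

#include <Bridge.h>
#include <Stepper.h>

const int STEPS_PER_REV = 200;                   // typical 1.8-degree stepper (assumed)
Stepper gearDrive(STEPS_PER_REV, 8, 9, 10, 11);  // placeholder driver pins

const int RED_PIN = 3, GREEN_PIN = 5, BLUE_PIN = 6;  // placeholder PWM pins

void setup() {
  Bridge.begin();           // link to the Yun's Linux side
  gearDrive.setSpeed(10);   // slow, meditative movement (RPM)
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

void loop() {
  // Read the averaged sentiment score (-1.0 .. 1.0) published by the
  // server under a hypothetical "mood" key.
  char buffer[16];
  Bridge.get("mood", buffer, sizeof(buffer));
  float mood = atof(buffer);

  // Map negative moods toward calming blue, positive toward warm amber.
  int warmth = constrain((int)((mood + 1.0) * 127.5), 0, 255);
  analogWrite(RED_PIN, warmth);
  analogWrite(GREEN_PIN, warmth / 2);
  analogWrite(BLUE_PIN, 255 - warmth);

  // Nudge the gear train: open the lamp for positive moods, close for negative.
  gearDrive.step(mood >= 0 ? 20 : -20);

  delay(5000);              // update every few seconds
}

In the actual lamp the step counts would depend on the ratio between the large gear and the five small gears; here they are arbitrary values chosen only to show the control flow.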
MINDFUL LAMP - Documentation
Mean Meal is an interactive video built with Treehouse by Interlude.
CLICK on the buttons embedded in the video and choose your own narrative!
Project developed with Brett Stiller and Nikolaj Slot Petersen.