HOW ANIMALS SEE: INSIDE THE EXTRAORDINARY WORLD OF ANIMAL VISION
WORK IN PROGRESS
A web-based VR experience exploring the differences between human and animal vision. It offers an artistic interpretation of how the world might appear to one of four animals: a bee, a dog, a bat, or a rattlesnake, and shows how each animal's visual perception enhances its ability to find food or to detect predators and prey.
DILATION is a study of motion-capture animation techniques through dance. The project explores the perception of elapsed time through a sequence of eloquent dance solos and duets, following the idea of space-time relativity, in which there are no absolute positions in time and space. A non-linear journey demonstrates how the body constructs and affects the space, and how the perception of movement changes over time. The emotion intensifies from slow, conscious sections to rapid explorations until an eventual regression.
The project was created for display at the IAC Building in New York, whose custom-made screen measures 11520 x 980 pixels.
Project made in Unreal Engine 4. Models created in MakeHuman and rigged in MotionBuilder, with additional modeling in UE4 and Maya. Performances captured using OptiTrack and Motive.
Produced in collaboration with Chang Liu. Music by De Ke - "CHAOS". Dance solos performed by Michaela Rae Mann and Berit Ahlgren.
Shy Art Show is a proposal for an art exhibition displaying a collection of ‘shy’ objects that actively avoid being looked at. The goal of the exhibition is to force the viewer to become aware of their process of experiencing the artwork and take time to appreciate the objects on display without the distraction of mobile phones and other electronic devices.
Interactive plan of the exhibition space made in Unreal Engine 4.
nth wall is a multimedia performance exploring the intersection of the physical world and virtual reality. The project is a result of a collaboration between Joanna Wrzaszczyk, Nicholas Bratton, Zhen Liu, and Dylan Filingeri.
New technologies such as the Oculus headset and augmented reality mobile apps have forced us to confront the digital and the physical, and to distinguish between the online and the offline. These two worlds often appear separate: the digital world is 'virtual' and the physical world is 'real'. Through this project we try to merge material reality with digital information and blur the boundaries between these two worlds.
The finished product is a platform for exploring the permeable borders between the virtual and the real. It is a virtual reality world set in a real space that allows for real-life interactions with objects and people, enabling the user to physically cross from one world to the other. We use mixed-reality boundaries to understand and disrupt human perception and to trigger new ways of making sense of the world around us.
Project made in Unreal Engine 4. Motion capture recorded with OptiTrack and Motive.
nth wall - world walk-through
nth wall - interaction between a mocap performer and a VR immersee
#nofilter, or how to find magical in the mundane is a video and photography installation that examines the difference between using filters as image embellishments and as a function of narrative storytelling.
Unlike memory, photographs are simply a record of an event that occurred at a certain time in a certain place. Instead of altering each picture after taking it with Instagram's digital filters, I looked at reality through the lens of my camera and a layer of colorful Plexiglas, thus changing the way I saw the scene in front of me as I took the picture.
#nofilter, or how to find magical in the mundane is part of my B.F.A. thesis at SUNY College at Purchase.
Encoding Mechanical Eye
A short experimental film made using a microscope.