HOW ANIMALS SEE | AR EXPERIENCE

Experience Design and Implementation (December 2015 - May 2016)
Role: Experience Designer / Creative Technologist
Thesis for a Master's Degree at NYU Interactive Telecommunications Program

 

PROJECT OVERVIEW


How Animals See: Inside the Extraordinary World of Animal Vision is a Web-based AR experience exploring the differences between human and animal vision. It offers an artistic interpretation of how the world might appear to one of four animals: a bee, a dog, a bat, and a rattlesnake, and of how each animal's visual perception enhances its ability to find food or to detect predators and prey. A website built with Three.js lets viewers see the world through the eyes of each of the four animals by loading the page on their cellphone, granting access to the phone's native camera, and placing the device inside a Google Cardboard.

Tools Used
Three.js, HTML5 / CSS3, 3D scanning with a Structure Sensor, Maya, Clara.io
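
To give a sense of how the pieces described above can fit together, here is a minimal sketch (not the project's actual source) of streaming the phone camera into a Three.js scene and rendering it side by side for a Google Cardboard viewer. It assumes the StereoEffect helper that ships with the Three.js examples; the plane size and camera settings are illustrative.

// Grab the rear-facing camera and pipe it into a <video> element.
var video = document.createElement('video');
video.autoplay = true;
video.playsInline = true;
navigator.mediaDevices.getUserMedia({ video: { facingMode: 'environment' } })
  .then(function (stream) { video.srcObject = stream; });

// A basic Three.js scene with the live feed mapped onto a plane.
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 100);
var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

var feed = new THREE.VideoTexture(video);
var videoPlane = new THREE.Mesh(
  new THREE.PlaneGeometry(16, 9),
  new THREE.MeshBasicMaterial({ map: feed })
);
videoPlane.position.z = -10;
scene.add(videoPlane);

// StereoEffect (from the Three.js examples) splits the output into
// left/right halves so the phone can sit inside a Cardboard headset.
var effect = new THREE.StereoEffect(renderer);
effect.setSize(window.innerWidth, window.innerHeight);

(function animate() {
  requestAnimationFrame(animate);
  effect.render(scene, camera);
})();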

 

 

THE QUESTION

Not all animals see the world the way we do. Some species see brightly hued landscapes in colors invisible to human eyes; others can't see color at all. How any species perceives the world depends on how its sense organs and nervous system interpret cues. Animals have changed and adapted over time to suit their environments and improve their chances of survival. Every organism, then, lives in a somewhat unique world determined by its senses. Unlike shape or size, color is not an inherent property of an object but a product of the viewer's sensory system. Researchers examine how sensory perception differs from one animal to another: we now know, for example, that birds can see many more colors than humans and that wax moths can hear frequencies far beyond the range of human hearing. These findings made me wonder: what does the world look like from the point of view of a snake, a fly, or a bat?

 

WHY VISION?

Medical researchers often look to the animal kingdom to better understand our own anatomy. By learning more about animal vision, we may find new ways to treat human eye diseases or even enhance our own visual system. Attempts to emulate nature's use of ultrasonic sound (sending out high-frequency signals and analyzing the time delay of the returning echoes) have already been made, and humans' successful use of echolocation can help blind people "see" using sound alone. Because vision is the dominant human sense and the primary source of our information about the world, I decided to make visual perception the focus of my project. What I find especially interesting about exploring animal vision is its potential to help us build new technologies and improve current ones, such as more effective cameras and better visual aids. Night vision can find applications in car technologies (driving aids) and camera optics. At the same time, the more we know about animals' eyes, the better we will understand the world they live in. By creating a first-person experience and immersing viewers in the sights of animals, I hope to create a sense of empathy and a deeper understanding of these creatures.

The project stems from my interest in the senses and human perception. I started working on my thesis with the goal of experimenting with sensory illusions in VR. However, once I dived into the world of sensory experiences and perception, I became fascinated by the differences between the way we humans see the world and the way other animals see it.

 


DOG, BEE, RATTLESNAKE & BAT

I decided to focus on a dog, a bee, a rattlesnake, and a bat specifically because of the diversity of their visual abilities. Unlike humans, who have three types of color-sensitive cone cells in the retina (red, green, and blue), dogs have dichromatic vision and see only in shades of blue and yellow; their vision is also less sensitive to variations in brightness and in shades of gray. Bees, on the other hand, see a world literally hidden from our eyes: the range of light they perceive is shifted toward the violet end of the spectrum and away from the red, which means bees can detect ultraviolet light and use it to find nectar in flowers. Rattlesnakes sense infrared thermal radiation, "seeing" radiant heat at wavelengths between 5 and 30 μm, which lets even a blind rattlesnake strike the vulnerable body parts of its prey with precision. Lastly, bats use echolocation, emitting sound waves and interpreting the returning echoes to determine where objects are in space, to navigate and find food in the dark. Together these distinct abilities make for a compelling mix of animal visual perception.
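
As an illustration of how such traits can be turned into imagery, below is a small sketch of two per-pixel color transforms. These are illustrative stand-ins written for this write-up, not the project's actual mappings, and both are artistic approximations rather than physiological models.

// Rough dog-vision approximation: collapse red and green into a single
// "yellow" value and keep blue, imitating dichromatic (blue/yellow) vision.
function dogVision(r, g, b) {
  var yellow = (r + g) / 2;
  return [yellow, yellow, b];
}

// Fake rattlesnake "thermal" view: an ordinary RGB camera cannot capture
// mid-infrared, so pixel brightness stands in for heat on a cold-to-hot ramp.
function rattlesnakeVision(r, g, b) {
  var heat = (r + g + b) / 3;      // 0..255
  return [heat, 64, 255 - heat];   // warmer areas tinted red, cooler areas blue
}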

 

 

THE IMPLEMENTATION

As a starting point, I used a webcam and built custom filters that analyze the camera feed and alter pixels to imitate animal vision. My goal was to let viewers see their own surroundings, rather than the natural environment of a given animal. I soon realized, however, that image filters alone could not precisely recreate animal vision, since many animals can see beyond our visible spectrum. Therefore, in addition to the webcam feed, I decided to model 3D environments with pre-calculated colors, which allows for a more precise imitation of animal vision. While wearing a simple VR headset, viewers can experience their immediate surroundings through the eyes of one of four animals: a bee, a dog, a rattlesnake, and a bat. The app visualizes an artistic interpretation of how our own environment might appear to these animals and envisions how each animal's visual perception enhances its ability to find food or detect predators and prey.
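
To sketch what such a webcam filter pipeline can look like (assuming a canvas-based approach; the project's actual filters may differ), each frame is drawn to a canvas, its pixels are rewritten with a transform such as the dogVision function sketched earlier, and the result is shown to the viewer.

var canvas = document.createElement('canvas');
var ctx = canvas.getContext('2d');

// Draw the current video frame, rewrite every pixel with the chosen
// animal transform, and leave the result on the canvas for display.
function applyFilter(video, transform) {
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);

  var frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
  var data = frame.data;                        // RGBA bytes
  for (var i = 0; i < data.length; i += 4) {
    var out = transform(data[i], data[i + 1], data[i + 2]);
    data[i] = out[0];
    data[i + 1] = out[1];
    data[i + 2] = out[2];                       // alpha channel left untouched
  }
  ctx.putImageData(frame, 0, 0);
}

// For example, once per animation frame:
// applyFilter(video, dogVision);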

 

VIDEO DOCUMENTATION, 2016