HOW ANIMALS SEE: INSIDE THE EXTRAORDINARY WORLD OF ANIMAL VISION
WORK IN PROGRESS
A web-based VR experience exploring the differences between human and animal vision. It provides an artistic interpretation of how the world might appear to one of four animals: a bee, a dog, a bat, and a rattlesnake. It also shows how each animal's visual perception enhances its ability to find food or to detect predators and prey.
Triangulation and Image Processing
Project inspired by the Delaunay triangulation algorithm. The script uses Delaunay triangulation and its dual, the Voronoi diagram, to analyze photographs and convert them into triangulated images. The newly created images are then edited in Adobe Photoshop.
Code written in the Processing language using the open-source toxiclibs and mesh libraries.
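The triangulation itself is handled by toxiclibs, but the underlying idea is simple to state: a triangle belongs to the Delaunay triangulation exactly when its circumcircle contains no other input point. A minimal brute-force sketch of that criterion (in Python rather than Processing, with hypothetical points; suitable only for small point sets):

```python
import itertools

def circumcircle(a, b, c):
    # center and squared radius of the circumcircle of triangle abc,
    # or None if the three points are (nearly) collinear
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2 * (ax*(by-cy) + bx*(cy-ay) + cx*(ay-by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax*ax+ay*ay)*(by-cy) + (bx*bx+by*by)*(cy-ay) + (cx*cx+cy*cy)*(ay-by)) / d
    uy = ((ax*ax+ay*ay)*(cx-bx) + (bx*bx+by*by)*(ax-cx) + (cx*cx+cy*cy)*(bx-ax)) / d
    return (ux, uy), (ax-ux)**2 + (ay-uy)**2

def delaunay(points):
    # brute force: keep every triangle whose circumcircle is empty of other points
    tris = []
    for i, j, k in itertools.combinations(range(len(points)), 3):
        cc = circumcircle(points[i], points[j], points[k])
        if cc is None:
            continue
        (ux, uy), r2 = cc
        if all((px-ux)**2 + (py-uy)**2 >= r2 - 1e-9
               for idx, (px, py) in enumerate(points) if idx not in (i, j, k)):
            tris.append((i, j, k))
    return tris

# a triangle with one interior point triangulates into three triangles
pts = [(0, 0), (4, 0), (2, 4), (2, 1)]
print(delaunay(pts))  # [(0, 1, 3), (0, 2, 3), (1, 2, 3)]
```

In the actual pipeline, sample points would come from the photograph (e.g. denser sampling in detailed regions), and each triangle would be filled with the average color of the pixels it covers.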
The Traveler's Lamp is the result of a collaboration between Joanna Wrzaszczyk and Ross Goodwin. The lamp visualizes the traveling salesman problem on a set of 3D-printed node-cities modeled in Rhinoceros. The city-nodes are designed based on descriptions of chosen cities from Italo Calvino's book Invisible Cities.
The Traveler's Lamp is a three-dimensional set of vertices enclosed in an engraved, laser-cut Plexiglas form with a custom-made wooden stand. The vertices are connected with thin fishing line to LEDs that visualize a computer algorithm (running on a Raspberry Pi) approximating the traveling salesman problem between the node-cities in real time.
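The text does not say which approximation the Raspberry Pi runs; a common lightweight choice for real-time visualization is the nearest-neighbor heuristic, sketched below in Python with hypothetical 2D city coordinates (the real lamp's node-cities are 3D):

```python
import math

def nearest_neighbor_tour(cities, start=0):
    # greedy TSP approximation: repeatedly travel to the closest unvisited city
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        here = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(here, cities[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# hypothetical node-city positions
cities = [(0, 0), (2, 0), (6, 0), (3, 3)]
print(nearest_neighbor_tour(cities))  # [0, 1, 3, 2]
```

On the lamp, each step of such a tour could light the LED of the corresponding city-node, animating the route as it is computed.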
Mindful Lamp is a prototype of a 3D-printed smart health appliance that interprets a user's emotions by accessing their social media accounts and analyzing the sentiments in their feeds. The project was developed in collaboration with Jedy Chen, Martin Romero, and Stream Gao.
Most current smart health and health-monitoring devices are unbalanced between their data input and analysis modules and their feedback modules: they focus on collecting information and transforming it into readable data for the user, but do little with that data afterwards. The Mindful Lamp server automatically collects emotion types from the user's social media and calculates an average sentiment score, using sentiment analysis and image recognition to derive emotion values. The lamp reacts to the collected data by changing its color and shape in order to help the user meditate and improve his or her mental well-being.
Ideally, collecting a user's emotions should be passive and unobtrusive, requiring minimal user input. Based on these principles, we decided to use the public feeds in the user's social accounts together with speech and tone analysis. There is also a way for the user to input their mood directly. The lamp uses the AlchemyAPI to analyze emotions.
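The scoring and feedback step can be sketched as follows. The [-1, 1] score range is typical of sentiment APIs, and the color mapping is a hypothetical example, not the lamp's actual palette:

```python
def average_sentiment(scores):
    # mean of per-post sentiment scores in [-1, 1] (API-style range; an assumption)
    return sum(scores) / len(scores) if scores else 0.0

def mood_to_color(score):
    # hypothetical mapping: low mood -> calming blue, high mood -> warm amber
    t = (score + 1) / 2                       # normalize score to [0, 1]
    blue, amber = (0, 0, 255), (255, 191, 0)
    return tuple(round(b + t * (a - b)) for b, a in zip(blue, amber))

posts = [0.6, -0.2, 0.4, 0.0]                 # hypothetical per-post scores
mood = average_sentiment(posts)               # about 0.2
print(mood_to_color(mood))
```

The same average could equally drive the shape-changing mechanism, with lower scores triggering slower, more soothing movement.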
An Arduino Yun serves as the general Mindful Lamp controller, driving its wireless module (WiFi), lighting module, and movement module. The motion of the lamp is produced by a combination of one big gear and five small gears, powered by a stepper motor.
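The gear train means the controller must translate a desired lamp movement into motor steps. A small worked example (Python; the step count and gear ratio are assumptions for illustration, not the lamp's actual specifications):

```python
STEPS_PER_REV = 200   # typical 1.8-degree stepper motor (assumption)
GEAR_RATIO = 5.0      # hypothetical big-gear : small-gear tooth ratio

def motor_steps_for_lamp_angle(degrees):
    # the motor drives a small gear; the big gear turns GEAR_RATIO times
    # slower, so the motor must take GEAR_RATIO times more steps
    return round(degrees / 360 * STEPS_PER_REV * GEAR_RATIO)

print(motor_steps_for_lamp_angle(90))  # 250
```

A side effect of the reduction is finer angular resolution at the lamp: each motor step moves the big gear by only 1.8 / 5 = 0.36 degrees.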