After my final presentation in class, Aankit and I continued to work on the project for the environmental fluid dynamics lab. While trying to figure out what interaction would work best with passers-by in the street, we decided it would be interesting to have the fluids respond to a person's image. We chose the FLIR thermal camera to capture the video and translated the grayscale input into differently coloured particles.
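The grayscale-to-colour translation can be sketched roughly like this: each pixel's brightness picks a point along a colour gradient. This is a minimal plain-Java sketch (outside Processing), and the blue-to-red palette endpoints are illustrative assumptions, not our exact palette:

```java
public class LuminanceMap {
    // Linearly interpolate between two RGB colours,
    // similar to Processing's lerpColor().
    static int[] lerpColor(int[] a, int[] b, double t) {
        int[] out = new int[3];
        for (int i = 0; i < 3; i++) {
            out[i] = (int) Math.round(a[i] + (b[i] - a[i]) * t);
        }
        return out;
    }

    // Map a grayscale value (0-255) to a colour between
    // a cold and a hot endpoint (assumed palette).
    static int[] mapLuminance(int gray) {
        int[] cold = {0, 0, 255}; // assumption: cold pixels render blue
        int[] hot  = {255, 0, 0}; // assumption: hot pixels render red
        return lerpColor(cold, hot, gray / 255.0);
    }

    public static void main(String[] args) {
        int[] c = mapLuminance(255);
        // Hottest pixel maps to the hot end of the gradient.
        System.out.println(c[0] + "," + c[1] + "," + c[2]);
    }
}
```

In the actual sketch the same mapping would run per pixel of the thermal frame, with each particle tinted by the colour at its pixel.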
The next iterations were:
- Using the flowfield image example to create a flow field based on a person’s face
- Importing a video from the Flir camera
- Drawing the vehicles in colours mapped to the luminance of the image, each at the location of its corresponding pixel
- Adding Syphon to connect to MadMapper
- Adding a button (after the first day of the show) so users could capture their own image when they liked it
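The flow-field step above follows the classic image-driven flow field idea: the image is divided into a grid, and each cell's brightness sets the angle of a steering vector that the vehicles follow. A minimal plain-Java sketch of that mapping (the brightness-to-angle formula is an assumption based on the standard example, not necessarily our exact code):

```java
public class FlowField {
    // Build a flow field from a grayscale image: each cell's angle
    // is derived from pixel brightness, as in the common
    // flow-field-from-image example.
    static double[][] fromBrightness(int[][] gray) {
        int rows = gray.length, cols = gray[0].length;
        double[][] angles = new double[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                // Assumption: map brightness 0-255 onto a full
                // rotation, 0 to 2*PI radians.
                angles[r][c] = gray[r][c] / 255.0 * 2 * Math.PI;
            }
        }
        return angles;
    }

    public static void main(String[] args) {
        int[][] img = {{0, 128}, {255, 64}};
        double[][] field = fromBrightness(img);
        // Mid-gray pixel points roughly half a rotation away.
        System.out.printf("%.3f%n", field[0][1]);
    }
}
```

Each vehicle then looks up the angle of the cell under it and steers toward that heading, so the swarm traces the bright regions of the face.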
All the portraits created by the display are on a Tumblr blog, tagged with the users' names.
The final code is here: