
Looks – an interactive installation by Klaus Kaufmann 

This installation was made for the course “Experimental Working Methods” during my exchange semester at Lapland UAS. It explores the theme of perceiving and being perceived by technology and was created using web technologies, so it runs in the browser. To display the animation, I used shaders: small programs that run on the graphics hardware and are used, for example, to create visual effects, textures, and lighting in 3D scenes. 

In the video, Klaus tests the installation. The video is 2:46 min long.  

After seeing multiple fluid simulations online, I wanted to use them as a starting point for this project. These simulations would calculate and render an effect that looks like ink in water and behaves dynamically like a fluid. I stumbled upon a great blog post by Misaki Nakano that explains how the shaders for the simulation are created in three.js, a JavaScript library primarily used for 3D graphics but also useful for creating 2D effects like these. 
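
As a rough sketch of this approach (illustrative only, not the code from Misaki Nakano’s post or from this installation), a full-screen shader effect in three.js can be set up roughly like this:

```javascript
// Minimal three.js setup for a full-screen shader effect: a plane covering the
// view, coloured by a fragment shader. The shader here is a placeholder, not a
// real fluid solver.
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);

const material = new THREE.ShaderMaterial({
  uniforms: { uTime: { value: 0 } },
  fragmentShader: `
    uniform float uTime;
    void main() {
      // Placeholder animation; a fluid simulation would compute this in several passes.
      gl_FragColor = vec4(0.5 + 0.5 * sin(uTime), 0.2, 0.4, 1.0);
    }
  `,
});
scene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), material));

renderer.setAnimationLoop((time) => {
  material.uniforms.uTime.value = time / 1000;
  renderer.render(scene, camera);
});
```

A real fluid simulation renders velocity and density into textures over several passes before drawing the final image, but the basic full-screen shader setup stays the same.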

Since the installation needed to be interactive, and since I could use a dual-projector setup, I tracked the viewer’s position and movement and used them as the input. A webcam was placed above the screen, looking at the space in front of it. Luckily, Google’s JavaScript library TensorFlow.js makes it fairly easy to run the machine learning model PoseNet for real-time pose estimation. 
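
A simplified sketch of this kind of tracking (an assumed setup, not the installation’s actual code) could look like the following, reading the viewer’s nose position from the webcam image on every frame:

```javascript
// Track a single viewer with PoseNet via TensorFlow.js and read out the nose position.
// `video` is assumed to be a <video> element already streaming the webcam feed.
import '@tensorflow/tfjs';
import * as posenet from '@tensorflow-models/posenet';

async function trackViewer(video) {
  const net = await posenet.load();

  async function update() {
    const pose = await net.estimateSinglePose(video, { flipHorizontal: true });
    const nose = pose.keypoints.find((k) => k.part === 'nose');
    if (nose && nose.score > 0.5) {
      // Normalised horizontal position (0..1) that can drive the animation and sound.
      const x = nose.position.x / video.videoWidth;
      console.log('viewer position', x);
    }
    requestAnimationFrame(update);
  }
  update();
}
```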

Although it wasn’t originally planned, I decided to add sound to the project. Initially, I used p5.js to generate the sounds but later switched to Tone.js. It allowed me to create digital synthesisers and filters that I could control with the parameters from the pose estimation and the animation. This meant I could play different sounds whenever the viewer moved or something happened in the animation. 
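
As an illustration of this kind of mapping (the names and values here are made up, not taken from the project), a Tone.js synthesiser and filter can be driven by a movement value like this:

```javascript
// Hypothetical mapping from a movement value (0..1) to sound with Tone.js.
import * as Tone from 'tone';

const filter = new Tone.Filter(800, 'lowpass').toDestination();
const synth = new Tone.Synth().connect(filter);

// Browsers only allow audio after a user gesture, so start the audio context on click.
document.addEventListener('click', () => Tone.start(), { once: true });

function onViewerMove(movementAmount) {
  // More movement opens the low-pass filter and triggers a short note.
  filter.frequency.rampTo(400 + movementAmount * 4000, 0.1);
  synth.triggerAttackRelease('C3', '8n');
}
```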

I set up the project so that it would loop through multiple scenes and tried to give them a distinct look and behaviour. 
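
The structure behind this can be imagined roughly as follows (an assumed sketch of the idea, not the project’s actual code): each scene gets its own update function, and a timer decides which one is active.

```javascript
// Cycle through three scenes, each with its own look and behaviour.
const scenes = [
  { name: 'eyes',  update: (time) => { /* animate the eyes, drive the drone sound */ } },
  { name: 'text',  update: (time) => { /* reveal the phrase, pulse with the heartbeat */ } },
  { name: 'waves', update: (time) => { /* flow the waves, play harmonica chords */ } },
];

const sceneDuration = 30000; // assumed length of one scene in milliseconds

function frame(time) {
  const index = Math.floor(time / sceneDuration) % scenes.length;
  scenes[index].update(time);
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```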

Three scenes are shown. The first features a bunch of eyes that form a structure resembling frog eggs. Their gaze follows the viewer, and they switch positions between the two sides of the screen. Additionally, a menacing drone sound is played, and its intensity varies according to the movements of the eyes. 

In the next scene, the phrase “Don’t look at me like that” is shown word by word. Once the phrase is complete, an image of the viewer’s face is displayed. Additionally, a heartbeat is played in sync with the pulsing of the animation. The pulsing gets stronger when there is no movement. 

In the last scene, when the viewer moves, a sound resembling a harmonica is played. A different chord is played each time the viewer starts to move. In addition, a wave-like structure covering the whole screen flows in the opposite direction of the viewer’s movement. 
 
 


Klaus Kaufmann is an exchange student in the Visual Arts degree programme.    

Tools explained: 

  • JavaScript is a programming language that can be used in web development.  
  • Three.js is used to show 3D content on a webpage. 
  • p5.js is a JavaScript library for creative coding. 
  • TensorFlow.js is a JavaScript library for training and deploying machine learning models. 
  • PoseNet is a machine learning model that estimates a person’s pose (the positions of key body points) in images or video in real time. 
  • Tone.js is used to create interactive music in the web browser. 

Course: Experimental Working Methods 
Supervision: Pia Keränen, Jari Penttinen