A short project created as part of a machine learning workshop at the Interactive Architecture Lab.
The visuals were created in Processing: simple ellipses drawn along a sine wave, with the wave's parameters mapped to different movements of the face.
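Processing is Java-based, so the core of the sine-wave layout can be sketched in plain Java. This is an illustrative reconstruction, not the original sketch: the parameter names (spacing, amplitude, wavelength, phase) are assumptions about which values the face movements might drive.

```java
// Illustrative sketch of laying ellipse centres out along a sine wave.
// In the actual Processing sketch, parameters like amplitude or phase
// would be driven by face-tracking values rather than hard-coded.
public class SineWaveLayout {
    // y-offset of the i-th ellipse along a sine wave
    static float waveY(int i, float spacing, float amplitude,
                       float wavelength, float phase) {
        float x = i * spacing;
        return amplitude * (float) Math.sin(2 * Math.PI * x / wavelength + phase);
    }

    public static void main(String[] args) {
        // e.g. amplitude could follow mouth openness, phase could follow head tilt
        for (int i = 0; i < 8; i++) {
            System.out.printf("ellipse %d: y = %.2f%n",
                              i, waveY(i, 20f, 50f, 160f, 0f));
        }
    }
}
```

In Processing itself, each `waveY` value would simply become the y-coordinate of an `ellipse()` call inside `draw()`.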
Face-tracking code in openFrameworks was used to detect changes in facial movement, and Wekinator was used to map those movements to the parameters of the Processing sketch.
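Wekinator's role in this pipeline is to learn a mapping from tracked facial features to sketch parameters, typically communicating with both ends over OSC. The stand-in below sketches that idea as a simple linear regression (one of the model types Wekinator offers for continuous outputs); the feature names, parameter names, and weight values are all made up for illustration, where Wekinator would instead learn the weights from recorded training examples.

```java
// Illustrative stand-in for the Wekinator step: a learned linear mapping
// from face-tracking features (e.g. mouth openness, eyebrow raise) to
// sketch parameters (e.g. wave amplitude, frequency). The weights here
// are invented; Wekinator learns them from user-provided examples.
public class FaceToWaveMapper {
    final float[][] weights;  // one row of weights per output parameter
    final float[] bias;       // one bias term per output parameter

    FaceToWaveMapper(float[][] weights, float[] bias) {
        this.weights = weights;
        this.bias = bias;
    }

    // Linear regression: out[j] = bias[j] + sum_i weights[j][i] * features[i]
    float[] map(float[] features) {
        float[] out = new float[bias.length];
        for (int j = 0; j < bias.length; j++) {
            out[j] = bias[j];
            for (int i = 0; i < features.length; i++) {
                out[j] += weights[j][i] * features[i];
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // two face features -> two wave parameters (values are illustrative)
        FaceToWaveMapper m = new FaceToWaveMapper(
            new float[][] { {40f, 0f}, {0f, 2f} },
            new float[] { 10f, 0.5f });
        float[] params = m.map(new float[] { 0.8f, 1.5f });
        System.out.printf("amplitude = %.1f, frequency = %.1f%n",
                          params[0], params[1]);
    }
}
```

In the real setup this computation lives inside Wekinator; the Processing sketch only receives the resulting output values over OSC and feeds them into its drawing parameters.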