Last semester, I had a chance to teach in the Tangible Embedded Interaction course in Interaction Design at MAU, alongside David Cuartielles, Simon Niedenthal, and Maliheh Ghajargar. In my section, a class of 40 bachelor students in design got to make sounds driven by different input and output devices using machine learning. Building on my work on the DeepFlow project, I spent a week with the students, drawing on the excellent resources around Wekinator (hats off to Fiebrink). The course was divided into four week-long workshops followed by a six-week project.
It was great to teach design students in the studio format; in my opinion, computer science has a lot to learn from studio-based courses. Students used house plants, heart rate, brainwaves, computer vision, and touch as inputs. They connected Processing and Arduino, using Wekinator as the middleman between the inputs and the outputs to different audio systems.
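To give a sense of how that pipeline fits together: Wekinator listens for OSC messages (by default on UDP port 6448, at the address `/wek/inputs`), so any sketch that can emit OSC floats can feed it sensor data. In Processing one would normally use a library like oscP5, but as a minimal sketch of what is happening on the wire, the OSC message can be encoded by hand in plain Java. The specific sensor values here are made up for illustration.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class WekinatorInputSketch {
    // OSC strings are null-terminated and padded to a multiple of 4 bytes.
    static byte[] oscString(String s) {
        byte[] raw = s.getBytes();
        int padded = (raw.length / 4 + 1) * 4;
        return ByteBuffer.allocate(padded).put(raw).array(); // unused bytes stay zero
    }

    // Encode one OSC message carrying a list of float inputs.
    static byte[] encode(String address, float[] values) {
        StringBuilder tags = new StringBuilder(",");
        for (int i = 0; i < values.length; i++) tags.append('f');
        byte[] addr = oscString(address);
        byte[] typeTags = oscString(tags.toString());
        ByteBuffer buf = ByteBuffer.allocate(addr.length + typeTags.length + 4 * values.length);
        buf.put(addr).put(typeTags);
        for (float v : values) buf.putFloat(v); // big-endian, as OSC requires
        return buf.array();
    }

    public static void main(String[] args) throws Exception {
        // Two hypothetical normalized sensor readings, e.g. from an Arduino.
        float[] sensors = {0.42f, 0.91f};
        byte[] msg = encode("/wek/inputs", sensors); // Wekinator's default input address
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(msg, msg.length,
                    InetAddress.getLoopbackAddress(), 6448)); // Wekinator's default port
        }
        // 12 bytes address + 4 bytes type tags + 8 bytes floats
        System.out.println(msg.length);
    }
}
```

Wekinator then maps those incoming floats to whatever output parameters the student trained it on, and forwards them (again over OSC) to the audio system.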