Prototyping and tuning expressive interactions with TouchDesigner and motion sensors
This workshop introduces students to the creation of dynamic, body-movement-based interfaces for influencing sound and video in real time.
Participants will explore how movement and orientation — from subtle shifts to full-body actions — can be mapped to digital media outputs in responsive and intentional ways. The workshop draws on mapping strategies observed in traditional instruments, as well as common signal processing techniques to inform the design of interactive digital systems. These principles will be approached through intuitive, hands-on exploration rather than formal theory.
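One of those common signal processing techniques is smoothing: raw sensor streams are jittery, and a simple one-pole low-pass filter makes the mapped output feel intentional rather than twitchy. The sketch below is purely illustrative (in the workshop this is done with TouchDesigner operators, not code); the class name and alpha value are our own choices, not part of any tool used in the workshop.

```python
class Smoother:
    """One-pole low-pass filter: a common way to tame jittery
    sensor readings before mapping them to a media parameter."""

    def __init__(self, alpha=0.5):
        self.alpha = alpha   # 0..1 — lower means smoother but laggier
        self.value = None    # no output until the first sample arrives

    def feed(self, sample):
        if self.value is None:
            self.value = sample
        else:
            # Move a fraction of the way toward the new sample
            self.value += self.alpha * (sample - self.value)
        return self.value


# A noisy accelerometer axis settles toward the underlying gesture:
s = Smoother(alpha=0.5)
readings = [0.0, 1.0, 0.0, 1.0, 1.0]
smoothed = [s.feed(r) for r in readings]
# smoothed -> [0.0, 0.5, 0.25, 0.625, 0.8125]
```

The single `alpha` parameter is the trade-off participants will feel directly: responsiveness versus stability.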
Participants will learn to:
Use free and open-source tools to prototype interactive systems with the sensors already embedded in Android or iOS phones.
Build basic interfaces that connect physical input (e.g. motion, position, orientation) to sound and video.
Apply strategies for creating generative audio and video, and/or incorporate interactive systems into their own projects.
Develop an intuitive understanding of different mapping strategies.
We will work primarily with TouchDesigner, using its visual programming environment (no coding necessary) to build interfaces and experiment with real-time mappings.
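The core mapping operation in such a patch is rescaling a sensor range to a parameter range. TouchDesigner handles this graphically (e.g. with a Math CHOP), so no code is needed; the following few lines are only a hedged conceptual sketch, and the example ranges (tilt angle, filter cutoff) are illustrative assumptions.

```python
def map_range(x, in_lo, in_hi, out_lo, out_hi, clamp=True):
    """Linearly map a value from one range to another --
    the basic building block of most mapping strategies."""
    t = (x - in_lo) / (in_hi - in_lo)   # normalize to 0..1
    if clamp:
        t = max(0.0, min(1.0, t))       # keep outliers in range
    return out_lo + t * (out_hi - out_lo)


# Example: map phone tilt (-90..90 degrees) to a filter cutoff (200..2000 Hz):
cutoff = map_range(45.0, -90.0, 90.0, 200.0, 2000.0)
# cutoff -> 1550.0
```

Whether a mapping feels expressive often comes down to choices like these ranges, whether to clamp, and whether the curve should be linear at all.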
Technical Preparation Beforehand
Please bring your own laptop and smartphone to the workshop.
Install Sensors2OSC for Android smartphones and/or ZigSim for iOS (iPhone).
Install TouchDesigner on your laptop.
Contact us if you encounter technical issues.
Important: If you are unsure whether your laptop is powerful enough for the workshop, please contact us.
---- Schedule ----
19.06 (Thu) 13:00 - 17:00
20.06 (Fri) 13:00 - 16:00
---- Location ----
Performance Class ter Heijne
Salzufer 13-14, 10587 Berlin
3rd floor (3. OG)