Electronic musician and creative technologist. Building live-coding instruments that respond to audience emotion in real-time. Max/MSP, SuperCollider, and vibes.
Live-coding as conversation: building instruments that listen
Q1: I build live-coding instruments in Max/MSP and SuperCollider that respond to audience biometric data: heart rate variability, skin conductance, and motion patterns from phone accelerometers. The music adapts in real time to the audience's emotional state. When the audience gets tense, the music might open up into space and calm; when they're relaxed, it might introduce dissonance to create productive tension.
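To make that inverse mapping concrete, here is a minimal, language-neutral sketch in Python of the kind of logic involved. This is an illustrative assumption, not the author's actual Max/MSP or SuperCollider patch; the function names, thresholds, and weightings are all hypothetical.

```python
def arousal_index(hrv_ms: float, skin_conductance_us: float) -> float:
    """Combine heart-rate variability (ms) and skin conductance (microsiemens)
    into a rough 0..1 arousal estimate. Lower HRV and higher conductance
    both tend to indicate tension. Scaling constants are illustrative."""
    hrv_term = max(0.0, min(1.0, 1.0 - hrv_ms / 100.0))       # low HRV -> tense
    sc_term = max(0.0, min(1.0, skin_conductance_us / 20.0))  # high SC -> tense
    return 0.5 * hrv_term + 0.5 * sc_term

def musical_response(arousal: float) -> dict:
    """Invert the audience state, as described above: a tense audience gets
    sparse, consonant, spacious material; a relaxed audience gets denser,
    more dissonant material."""
    return {
        "note_density": round(1.0 - arousal, 2),      # fewer events when tense
        "dissonance": round(1.0 - arousal, 2),        # more consonant when tense
        "reverb_mix": round(0.3 + 0.5 * arousal, 2),  # open up space when tense
    }

# A tense reading: low HRV (20 ms), high skin conductance (18 uS)
tense = musical_response(arousal_index(20.0, 18.0))
# A relaxed reading: high HRV (90 ms), low skin conductance (2 uS)
relaxed = musical_response(arousal_index(90.0, 2.0))
```

In a real patch these values would arrive as OSC or MIDI control streams and would need smoothing; the point here is only the inverse mapping from audience tension to musical density.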
Q2: Exactly the tension I'm navigating. My current approach: the system has a compositional 'intention', a trajectory it wants to follow, but it adjusts the path based on audience response. Like a river that has a destination but finds its own route.
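The river metaphor can be sketched as a control parameter that is steadily pulled toward a planned compositional target while an audience signal deflects the route it takes there. Again, this is a hypothetical Python sketch of the idea, not the author's system; the rate and responsiveness constants are assumptions.

```python
import math

def step(value: float, target: float, audience_pull: float,
         rate: float = 0.1, responsiveness: float = 0.05) -> float:
    """Move `value` a fraction `rate` of the way toward the compositional
    `target`, perturbed by `audience_pull` in [-1, 1] scaled by
    `responsiveness`. The target dominates long-term; the audience
    shapes the moment-to-moment path."""
    return value + rate * (target - value) + responsiveness * audience_pull

value = 0.0    # current state of some musical parameter
target = 1.0   # where the composition 'wants' to end up
for t in range(200):
    audience = math.sin(t / 10.0)  # hypothetical oscillating audience signal
    value = step(value, target, audience)
# The route varies with the audience, but value settles near the target.
```

The design choice mirrors the metaphor: the `rate` term is the river's gradient toward its destination, while `responsiveness` sets how much the terrain (the audience) bends the course without changing where it ends.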