Nightsky
Nightsky I & II are interactive extended reality demos that explore music visualization and audience interaction through data-driven particle systems in Unreal Engine. The works employ music information retrieval (MIR) techniques to analyze the audio signal in real time, extracting features such as spectral flux, amplitude, and rhythmic patterns that directly drive visual parameters like particle spawn rate, velocity, and noise modulation. Audience participation is integral: mobile phones transmit sensor data (e.g., accelerometer and gyroscope readings) to a central system, allowing real-time gestural input to shape the movement and behavior of particles within the immersive environment. This dual-input design, combining sonic analysis with embodied interaction, creates a generative space in which the visuals are continuously shaped by both the music and the audience's presence. Nightsky I & II reflect an interest in collective experience, real-time data processing, and the fusion of audiovisual aesthetics through collaborative, responsive media environments.
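
To make the audio-to-visual pipeline concrete, here is a minimal sketch of block-based MIR feature extraction of the kind described above, assuming mono audio arrives in fixed-size frames. The mapping function, its constants, and the parameter names (`spawn_rate`, `velocity`) are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def spectral_flux(prev_mag: np.ndarray, mag: np.ndarray) -> float:
    """Half-wave rectified spectral flux between consecutive FFT frames."""
    diff = mag - prev_mag
    return float(np.sum(np.maximum(diff, 0.0)))

def analyze_frame(samples: np.ndarray, prev_mag: np.ndarray):
    """Extract flux and RMS amplitude from one block of audio samples."""
    windowed = samples * np.hanning(len(samples))
    mag = np.abs(np.fft.rfft(windowed))
    flux = spectral_flux(prev_mag, mag)
    amplitude = float(np.sqrt(np.mean(samples ** 2)))  # RMS level
    return flux, amplitude, mag

def map_to_particles(flux: float, amplitude: float) -> dict:
    """Hypothetical feature-to-parameter mapping; constants are illustrative."""
    return {
        "spawn_rate": 50.0 + 400.0 * min(flux / 100.0, 1.0),  # bursts on onsets
        "velocity": 1.0 + 4.0 * amplitude,                    # louder = faster
    }
```

In practice, values like these would be forwarded each frame to the Unreal Engine particle system as user parameters.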
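The sensor side of the dual-input system can be sketched the same way. Below is a minimal receiver for phone sensor packets, assuming a plain UDP transport on port 9000 and a JSON message schema (`{"accel": [...], "gyro": [...]}`); both the transport and the schema are assumptions for illustration, as is the gesture-energy mapping to particle noise strength.

```python
import json
import socket

# Hypothetical UDP listener for phone sensor packets.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9000))

while True:
    data, addr = sock.recvfrom(1024)
    msg = json.loads(data)
    ax, ay, az = msg["accel"]  # accelerometer, m/s^2
    gx, gy, gz = msg["gyro"]   # gyroscope, rad/s
    # Gesture energy: deviation of acceleration magnitude from gravity,
    # normalized to [0, 1] and used to modulate particle noise per device.
    energy = abs((ax**2 + ay**2 + az**2) ** 0.5 - 9.81)
    noise_strength = min(energy / 5.0, 1.0)
    print(addr, round(noise_strength, 2))
```

Keying the value to the sender's address, as here, lets each audience member's device perturb the particle field independently.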

