Execution of ‘The Ring’

The Ring was exhibited on the 22nd of November 2024 during the queer art rave Riposte at Electrowerkz, a vibrant multi-floor venue located in the Angel area of London. This exhibition provided a unique opportunity to test audience engagement and interaction with the installation within the dynamic context of a club night. My objective was to explore diverse behaviours, degrees of individual engagement, and the overall impact of the piece on attendees who, through their interaction, effectively became performers themselves.

Positioning The Ring within the bustling, energetic environment of a club presented a mix of challenges and insights. The goal was to transform the typical club-goer experience, immersing participants in a blend of sonic and visual stimuli while encouraging them to engage beyond passive observation. By doing so, the installation sought to blur conventional boundaries—not just between body and sound, but also between audience and artwork.

Implementing The Ring in this particular setting proved both stressful and complex. The venue’s acoustically congested environment necessitated significant adaptations to the original design. Although the installation was conceived as a multichannel setup, the loud music from the surrounding rooms made this approach impractical. Instead, wireless headphones were utilized to create a more focused auditory experience. Additionally, the installation was placed in the club’s designated chill-out area, yet it was not immune to external noise interference, as a nearby loudspeaker directed music from one of the other rooms toward the installation space.

Despite these challenges, the installation successfully garnered a notable amount of interest and participation. Attendees engaged with The Ring in varied and often unexpected ways. Some individuals immersed themselves fully in the audiovisual (AV) games, exploring the interactivity and engaging with the installation’s sonic and visual elements as intended. Others, however, treated the installation more superficially, using it primarily as a backdrop for social media photos, with minimal interest in its interactive or auditory components.

Observing these interactions highlighted intriguing patterns in audience behaviour. The majority of participants seemed compelled to engage with the scales and explore the installation’s interactive features, demonstrating curiosity and playfulness. This response was particularly encouraging given the auditory challenges posed by the nearby loudspeaker.

This context-specific performance of The Ring offered valuable insights into how interactive installations function in non-traditional art spaces. The club night setting, with its inherent distractions and competing sensory stimuli, presented an entirely different dynamic than a gallery or more controlled environment might. These challenges underscored the importance of adaptability in interactive art and revealed fascinating tensions between audience intentions, environmental constraints, and the installation’s immersive potential. By inviting the audience to become co-creators, The Ring succeeded in fostering a participatory atmosphere, even if the depth of engagement varied widely among attendees. This experience will undoubtedly inform future iterations of the project, particularly in balancing accessibility, interactivity, and the intended impact of the installation.

Technology behind ‘The Ring’

This blog post is dedicated to the hardware part of the installation. The final execution differed from the intended form; here I will describe the intended form, while the differences in the final executed version, and the reasons behind them, will be explained in another blog post. The intended form of the installation consists of:

  • 16 programmable LED strips, approx. 190 cm tall (107 programmable LEDs), positioned in a circle
  • A pair of WiFi motion-tracking gloves
  • A Control Station
  • A pole with a ToF sensor and Bluetooth for triggering the animation (or AV Game)
Programmable LED strip WS2812B – the building block of the visual interface for The Ring

Each glove contains an ESP32 microcontroller, a BNO055 IMU sensor for motion tracking, and a DC-DC buck converter that steps the small LiPo battery’s 3.7 V down to the 3.3 V operating voltage shared by the ESP32 and the BNO055. Raw x, y, z data from the sensor are processed on the microcontroller and sent via WiFi on separate channels to the Control Station.
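To make that data flow concrete, below is a minimal sketch of what the glove firmware might look like, assuming the Adafruit BNO055 library for the IMU and ESP-NOW as the wireless link (the actual transport and packet format used in The Ring may differ). The receiver MAC address, the packet layout and the ~50 Hz update rate are placeholders of my own.

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <esp_now.h>
#include <WiFi.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28);     // BNO055 on the default I2C address

// MAC address of the matching receiver ESP32 in the Control Station (placeholder)
uint8_t receiverMac[] = {0x24, 0x6F, 0x28, 0x00, 0x00, 0x00};

struct OrientationPacket {
  float x;   // heading in degrees
  float y;   // roll in degrees
  float z;   // pitch in degrees
};

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("BNO055 not detected");
    while (true) delay(10);
  }
  WiFi.mode(WIFI_STA);
  if (esp_now_init() != ESP_OK) {
    Serial.println("ESP-NOW init failed");
    while (true) delay(10);
  }
  esp_now_peer_info_t peer = {};
  memcpy(peer.peer_addr, receiverMac, 6);
  peer.channel = 0;                                  // follow the current WiFi channel
  peer.encrypt = false;
  esp_now_add_peer(&peer);
}

void loop() {
  sensors_event_t event;
  bno.getEvent(&event);                              // absolute orientation in degrees

  OrientationPacket pkt;
  pkt.x = event.orientation.x;
  pkt.y = event.orientation.y;
  pkt.z = event.orientation.z;
  esp_now_send(receiverMac, (uint8_t *)&pkt, sizeof(pkt));

  delay(20);                                         // roughly 50 updates per second
}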

Control Station

The Control Station contains two ESP32 receivers, one for each glove. The incoming data are split and sent on to a Teensy 4.0, which contains the logic that converts specific angles within the 360-degree range into specific MIDI notes of pre-programmed scales, as well as one MIDI CC control. Each MIDI note covers 45 degrees of the circle. The Teensy 4.0 can be connected directly to any DAW. The second branch of the data split goes to two Teensy 4.1 boards, which drive the LED interface; each Teensy 4.1 handles 8 strips. I chose this arrangement because I found it necessary to use the OctoWS2811 library, which enables very fast, simultaneous LED animations compared to libraries such as FastLED or NeoPixel. I tried those too and found them too inefficient for fast, real-time applications. The Control Station also contains a Bluetooth module, which receives data from the pole’s ToF sensor.
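As an illustration of the angle-to-note logic described above, here is a minimal Teensy 4.0 sketch built on my own assumptions: the glove heading arrives as a float over a hardware serial line, G minor starting at G3 stands in for the pre-programmed scales, and CC 1 stands in for the single MIDI CC control. With the USB Type set to MIDI, the Teensy appears in the DAW as a class-compliant MIDI device.

// Teensy 4.0 – USB Type must be set to "MIDI" in the Arduino/Teensyduino tools menu.

const int SECTOR_DEG = 45;                             // one note per 45 degrees -> 8 notes
// Example scale: G minor over one octave starting at G3 (MIDI note 55)
const uint8_t gMinor[8] = {55, 57, 58, 60, 62, 63, 65, 67};

int lastNote = -1;                                     // note currently sounding

// Quantise a 0-360 degree heading into one of eight sectors and play that scale degree,
// retriggering only when the sector changes.
void handleAngle(float angleDeg) {
  int sector = ((((int)angleDeg % 360) + 360) % 360) / SECTOR_DEG;   // 0..7
  int note = gMinor[sector];
  if (note != lastNote) {
    if (lastNote >= 0) usbMIDI.sendNoteOff(lastNote, 0, 1);
    usbMIDI.sendNoteOn(note, 100, 1);
    lastNote = note;
  }
}

// Tilt of the left glove mapped onto a single MIDI CC (CC 1 chosen as a placeholder).
void handleTilt(float tiltDeg) {
  int value = constrain(map((int)tiltDeg, -90, 90, 0, 127), 0, 127);
  usbMIDI.sendControlChange(1, value, 1);
}

void setup() {
  Serial1.begin(115200);                               // angle data assumed to arrive from an ESP32 receiver
}

void loop() {
  if (Serial1.available() >= (int)sizeof(float)) {
    float angle;
    Serial1.readBytes((char *)&angle, sizeof(angle));
    handleAngle(angle);
    // handleTilt() would be fed the left glove's tilt value in the same way.
  }
  while (usbMIDI.read()) {}                            // keep the incoming USB MIDI queue serviced
}

Retriggering only when the sector changes keeps a sustained hand position from flooding the DAW with repeated note-on messages.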

Pole with ToF sensor and the rest of the Control Station

The pole contains a ToF (Time of Flight) sensor, which essentially measures the proximity of an object. The distance data are sent via a Bluetooth module to the Control Station. In the installation, it is used to trigger the initial animation and sound when someone enters The Ring.
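The sketch below shows one way the pole could be implemented, assuming a VL53L0X ToF sensor (the exact module is not specified above) read over I2C and a serial Bluetooth module wired to Serial1. The 600 mm trigger threshold and the "ENTER" message are placeholders for whatever protocol the Control Station actually expects.

#include <Wire.h>
#include <Adafruit_VL53L0X.h>

Adafruit_VL53L0X lox;
const uint16_t TRIGGER_MM = 600;        // "someone is standing in the entrance" threshold (assumed)

void setup() {
  Serial1.begin(9600);                  // serial Bluetooth module assumed on Serial1
  Wire.begin();
  if (!lox.begin()) {
    while (true) delay(10);             // sensor not found – halt
  }
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);

  if (measure.RangeStatus != 4) {       // 4 means the reading is out of range / invalid
    uint16_t mm = measure.RangeMilliMeter;
    Serial1.println(mm);                // raw distance for the Control Station
    if (mm < TRIGGER_MM) {
      Serial1.println("ENTER");         // placeholder trigger message for the Entry Scene
    }
  }
  delay(50);
}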

The Body as Musical Instrument by Atau Tanaka and Marco Donnarumma

The chapter “The Body as Musical Instrument” explores the concept of the human body serving as an integral musical instrument through embodied interaction, gesture, and physiological engagement. This framework synthesizes phenomenology, body theory, and human-computer interaction, examining the physical and technological extensions of the human form in musical performance.

Body and Gesture in Musical Contexts

The body’s involvement in music extends beyond tactile manipulation of instruments to a profound interplay between physicality, sound, and space. For example, brass instruments engage the player in a feedback loop, where acoustic resistance informs and adapts the performer’s physiological response, creating an interactive system of sound production and embodiment (p. 2). This phenomenon is tied closely to proprioception, the body’s innate sense of position and movement. Proprioception bridges conscious and unconscious motor control, allowing musicians to refine gestures and adapt their performance dynamically, as seen in how instrumentalists use diaphragmatic control to modulate tone or avoid injury (pp. 3–4).

The concept of body schemata, as discussed by Merleau-Ponty, highlights how the body integrates tools and instruments into its sensory and motor systems. For instance, the example of an organist illustrates how performers do not rely on the objective positions of pedals or stops but incorporate these elements into their extended proprioceptive field, creating a seamless interaction between body and instrument (p. 5). Musicians thus engage instruments affectively, using gestures that are intrinsically tied to their expressive intent, rather than merely mechanical actions (p. 6). As I understand it, the concept of body schemata extended through digital technology can be clearly seen in the video below.

Atau Tanaka, Suspensions for Piano & Myo Armband, performed by Giusy Caruso

Embodied Interaction and Technological Extensions

Technological advancements have amplified the role of gesture and the body in music, creating opportunities for innovative embodied interactions. Biosensors, such as EMG (electromyogram) and EEG (electroencephalogram) sensors, detect physiological signals directly from the body, transforming muscle movements or brain activity into musical control inputs. These devices exemplify the transformation of the body into a musical medium, a development highlighted by early gestural electronic instruments like the Theremin (pp. 7–9). I found the note about the posthuman hybridisation of the body with technology particularly interesting. These advancements align with Donna Haraway’s concept of the cyborg, where human and machine interact to form hybrid entities, expanding the expressive potential of the human body beyond traditional boundaries (p. 6).

Paul Dourish’s perspective on embodied interaction further situates these developments, emphasizing that interfaces should not merely represent physical interaction but actively become mediums of interaction (p. 8). In this context, technologies like biosensors and motion capture systems enable performers to seamlessly integrate their physiological and gestural inputs into musical creation, fostering more profound connections between body, instrument, and sound.

Gestural and Physiological Performance Practices

Recent works demonstrate the evolving interplay between body and technology. Atau Tanaka’s Kagami (1991) transformed muscle tension, detected via EMG signals, into MIDI data to control digital sound, establishing a direct and intuitive connection between gesture and sonic output (p. 13). Marco Donnarumma’s Ominous (2013) extended this approach, using mechanomyogram signals to create interactive soundscapes shaped by whole-body gestures, effectively molding sound like a sculptural material in space (p. 14). These examples emphasize the transition from static instrument manipulation to adaptive systems where performer and instrument co-evolve (pp. 16–17).

These practices challenge traditional control paradigms by fostering adaptive configurations in which the instrument responds dynamically to the performer’s physiological and gestural inputs. For instance, in Ominous, the interplay between the performer’s muscular activity and the neural networks driving the instrument illustrates a symbiotic relationship, blurring the boundaries between human control and technological agency (p. 16).

The integration of gesture, body, and technology redefines the concept of musical instruments, positioning the human body as a central, adaptable, and dynamic component in sound creation. Through physiological processes and technological extensions, performers achieve novel interactions with space, sound, and audience. As this chapter demonstrates, the body as a musical instrument not only adapts to evolving technologies but also transforms them, extending the boundaries of human expression in music (pp. 17–18).

This synthesis of embodied interaction, gesture, and physiological integration creates emergent musical forms, aligning with the posthuman notion of hybridized entities that merge physical and digital realms in artistic practices (p. 18).

Atau Tanaka has been a significant inspiration for my practice, particularly as I reflect on the similarities and differences between our approaches, especially in relation to The Ring. While we both explore the concept of the human body as a musical instrument, our perspectives diverge. Tanaka primarily focuses on internal aspects, such as muscle tension and physiological signals, whereas my work emphasizes external bodily movements. Additionally, The Ring seeks to extend this exploration by engaging the audience, aiming to dissolve another layer of duality—not only between body and sound performance but also between the audience and the art piece itself.

Bibliography:

Tanaka, A. and Donnarumma, M., 2018. The Body as Musical Instrument. In Y. Kim and S. Gilman (eds.), The Oxford Handbook of Music and the Body. [online] Oxford University Press. Available at: https://doi.org/10.1093/oxfordhb/9780190636234.013.2 [Accessed 27 Nov. 2024]

Sound design for ‘The Ring’ installation

At this stage of the installation’s development, I have created two distinct “sonic situations” that are integral to the experience: the Entry Scene (also known as the Entry AV Game) and the Scales (or Scale Game).

The Entry Scene: Adjustments and Execution

The Entry Scene was initially designed to create a visual and auditory transition for participants as they stepped into the circle. My original concept was to have all 16 LED strips progressively light up from the left and the right towards the middle at the front, converging to form a complete ring that encloses the participant. However, due to a severe hardware malfunction the day before the exhibition, I had to scale the animation back to only 8 LED strips. Despite this limitation, the adjustment preserved the core concept of an engaging and immersive entry point to the installation.

The sonic aspect of the Entry Scene complements the visual elements by employing rhythmic drum steps that align with the LED animation. As the LED strips transition step by step from white to red, the accompanying drum sounds build in intensity before the scene shrinks, visually and aurally, into a thin ring of light. The drum sound itself is a processed sample, crafted from a recording of a rusty metal tank in my basement. The raw, industrial quality of the sample adds a tactile and somewhat primal atmosphere to the scene, reinforcing the visceral nature of the installation’s aesthetic.
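For the LED side, here is a rough sketch of the scaled-back 8-strip animation using OctoWS2811 on a Teensy 4.x, taking 107 LEDs per strip from the parts list. The timing, colors and the white-to-red fade curve are stand-ins of my own; in the installation each step is synchronised with a drum hit rather than a fixed delay.

#include <OctoWS2811.h>

const int ledsPerStrip = 107;                          // from the parts list
DMAMEM int displayMemory[ledsPerStrip * 6];
int drawingMemory[ledsPerStrip * 6];
OctoWS2811 leds(ledsPerStrip, displayMemory, drawingMemory, WS2811_GRB | WS2811_800kHz);

const int numStrips = 8;                               // scaled-back version
const unsigned long stepMs = 400;                      // stand-in for one drum step

void fillStrip(int strip, int color) {
  for (int i = 0; i < ledsPerStrip; i++)
    leds.setPixel(strip * ledsPerStrip + i, color);
}

void setup() {
  leds.begin();
  leds.show();
}

void loop() {
  // Steps 1-8: each strip comes on in white, one per drum hit.
  for (int s = 0; s < numStrips; s++) {
    fillStrip(s, 0xFFFFFF);
    leds.show();
    delay(stepMs);
  }
  // Then the whole ring fades from white towards red.
  for (int v = 255; v >= 0; v -= 5) {
    for (int s = 0; s < numStrips; s++)
      fillStrip(s, (0xFF << 16) | (v << 8) | v);       // red stays full, green/blue fade out
    leds.show();
    delay(30);
  }
  delay(3000);                                         // hold before restarting (demo only)
}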

The initial 2×8-step Entry Scene featured a distinct spatial imaging compared to the final 8-step version. In the original setup, 8 drum steps moved to the left and right, converging in the center at a distance from the listener. Reverb was applied to enhance the perception of depth, while panning emphasized the directional movement, creating a more immersive spatial experience.

The final 8-step Entry Scene progresses from left to right, featuring a different panning approach. The reverb, used to convey a sense of distance, is also applied differently, resulting in a distinct spatial perception compared to the original version.

Final 8-drum-step arrangement of the Entry Scene in Ableton Live. I am using my favourite reverb, Valhalla VintageVerb.

A few seconds after the Entry Scene, The Scale AV Game begins. Each glove is programmed to control a different musical scale—G Minor for the left glove and D Major for the right. The tilt of the left glove also controls a MIDI CC parameter. Currently, only one parameter is implemented to keep the setup simple, but I plan to expand this functionality by adding more MIDI CC parameters to control additional effects in future iterations.

I’ve chosen a simple Saw64 wave in Operator, with a touch of reverb and delay. When the left hand is tilted, the pitch of the note shifts by up to 100 cents, creating a “pulling” or “tuning string” effect in the sound.

An audience member / performer playing the scales

Inspiration for a New Project: A MIDI Theremin with Visual Feedback

After extensive work with LED strips, particularly the WS2812B, I found myself inspired to create something new while working on The Ring. One improvised idea that emerged during the development of the scales in The Ring was to design a Theremin-like MIDI controller. This device would allow the user to trigger MIDI notes from a pre-programmed scale without physical contact, relying on motion sensors and providing visual feedback through LEDs. The concept was a natural extension of my work, utilising my growing expertise in coding, microcontrollers, and sensor integration.

To bring this idea to life, I built the MIDI controller using two ultrasonic sensors (HC-SR04), a 60 cm programmable WS2812B LED strip, and an Arduino Micro. The Arduino Micro, equipped with the ATmega32U4 chip, was particularly suitable for this project as it supports direct MIDI communication with DAWs (Digital Audio Workstations) and other MIDI-compatible instruments. This eliminated the need for additional hardware or software bridges, making the device streamlined and efficient.

I utilized the MIDIUSB and NeoPixel libraries in C++ to program the device. The ultrasonic sensors were configured to detect hand movements within a certain range, triggering MIDI notes based on the distance of the user’s hands from the sensors. Each sensor was assigned to a different musical scale, similarly to the gloves in The Ring, creating a dual-layered experience. To add a layer of visual feedback, I programmed the LED strip to light up in distinct colors corresponding to each scale. This ensured that users could easily distinguish between the two scales, enhancing both the functionality and the aesthetic appeal of the device.
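The sketch below reconstructs that logic under assumptions of my own: the pin numbers, the 5–40 cm playing range, the LED count and the two colors are illustrative, while the two scales, the one-scale-per-sensor mapping and the MIDIUSB/NeoPixel libraries follow the description above.

#include <MIDIUSB.h>
#include <Adafruit_NeoPixel.h>

const int TRIG_L = 2, ECHO_L = 3;          // left sensor (G minor) – assumed pins
const int TRIG_R = 4, ECHO_R = 5;          // right sensor (D major) – assumed pins
const int LED_PIN = 6, NUM_LEDS = 36;      // 60 cm strip at 60 LEDs/m (assumed density)

Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);

const uint8_t gMinor[8] = {55, 57, 58, 60, 62, 63, 65, 67};  // G3 upward
const uint8_t dMajor[8] = {62, 64, 66, 67, 69, 71, 73, 74};  // D4 upward

int lastNoteL = -1, lastNoteR = -1;

long readDistanceCm(int trig, int echo) {
  digitalWrite(trig, LOW);  delayMicroseconds(2);
  digitalWrite(trig, HIGH); delayMicroseconds(10);
  digitalWrite(trig, LOW);
  long duration = pulseIn(echo, HIGH, 30000);   // timeout counts as out of range
  return duration * 0.034 / 2;                  // microseconds -> centimetres
}

void noteOn(uint8_t ch, uint8_t pitch, uint8_t vel) {
  midiEventPacket_t p = {0x09, (uint8_t)(0x90 | ch), pitch, vel};
  MidiUSB.sendMIDI(p);
}

void noteOff(uint8_t ch, uint8_t pitch) {
  midiEventPacket_t p = {0x08, (uint8_t)(0x80 | ch), pitch, 0};
  MidiUSB.sendMIDI(p);
}

// Map a 5-40 cm hand distance to one of the eight scale degrees,
// retriggering only when the note changes.
void playScale(long cm, const uint8_t *scale, int &lastNote, uint8_t channel) {
  if (cm < 5 || cm > 40) {                      // hand outside the playing range
    if (lastNote >= 0) { noteOff(channel, lastNote); lastNote = -1; }
    return;
  }
  int idx = map(cm, 5, 40, 0, 7);
  int note = scale[idx];
  if (note != lastNote) {
    if (lastNote >= 0) noteOff(channel, lastNote);
    noteOn(channel, note, 100);
    lastNote = note;
  }
}

void setup() {
  pinMode(TRIG_L, OUTPUT); pinMode(ECHO_L, INPUT);
  pinMode(TRIG_R, OUTPUT); pinMode(ECHO_R, INPUT);
  strip.begin();
  strip.show();
}

void loop() {
  long dL = readDistanceCm(TRIG_L, ECHO_L);
  long dR = readDistanceCm(TRIG_R, ECHO_R);

  playScale(dL, gMinor, lastNoteL, 0);          // channel 1: G minor
  playScale(dR, dMajor, lastNoteR, 1);          // channel 2: D major

  // Visual feedback: left half of the strip shows the G minor colour,
  // right half the D major colour, lit only while a note is active.
  for (int i = 0; i < NUM_LEDS / 2; i++)
    strip.setPixelColor(i, lastNoteL >= 0 ? strip.Color(0, 80, 255) : 0);
  for (int i = NUM_LEDS / 2; i < NUM_LEDS; i++)
    strip.setPixelColor(i, lastNoteR >= 0 ? strip.Color(255, 40, 0) : 0);
  strip.show();
  MidiUSB.flush();

  delay(30);
}

Sending a note-off whenever the hand leaves the playing range keeps notes from hanging in the DAW between gestures.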

The result was a responsive and visually striking MIDI instrument that combined gesture-based control with dynamic lighting. The experience of using this MIDI Theremin went beyond sound; it became a multisensory interaction where movement, sound, and light converged seamlessly.

The MIDI Theremin was successfully performed during the Chronic Illness XXIII event, showcasing its potential in a live performance setting. Watching it in action during the event confirmed its versatility, not just as a standalone instrument but also as a tool for enhancing interactive installations or live sets. I definitely plan to incorporate this MIDI Theremin as a permanent feature in my setup for live musical performances.

Performing the MIDI Theremin at Chronic Illness XXIII

Creating ‘AV’ Games

As I touched on briefly in my previous blog post, the Circle series is centered around the concept of audio-visual games, where a designated “conductor” takes control. Positioned within the circle and equipped with motion-tracking gloves, the conductor manipulates sound and visuals in real time, creating an immersive, interactive experience. The LED interface, consisting of 16 LED strips arranged in a ring, serves as the visual canvas for this dynamic interplay.

Building on my prior experience with creating interactive gloves and using motion to control sound, I feel confident in generating and manipulating audio elements through hand gestures. This familiarity has allowed me to focus more intently on exploring and refining the visual components of the installation. My goal is to design an engaging and intuitive system where light and sound not only complement but also amplify each other.

The “Entry Game”: A Gateway to Interaction

The first element I’ve programmed for The Circle series is the “Entry Game.” This game is designed to trigger automatically as the conductor steps into the circle. The concept behind the Entry Game is to provide an immediate, engaging introduction to the system. Upon entry, the motion-tracking gloves activate a sequence of lights on the LED strips, signaling that the conductor has entered a new interactive domain. This game acts as a gateway, setting the stage for deeper levels of interaction while ensuring the conductor feels immersed from the outset.

“Digital Hula Hoop”: A Work in Progress

Another game currently in development is the “Digital Hula Hoop.” This element focuses on creating a visual and sonic interplay that responds dynamically to the conductor’s movements. The idea is to program two light circles in different colors, representing the conductor’s hands. These circles will move and tilt within the LED ring based on the motion data captured by the gloves.

At this stage, the animation for the Digital Hula Hoop is automated and does not yet include sound integration. However, the visual elements are being refined to ensure smooth and intuitive responsiveness. The next step involves linking the motion-tracking data to control the position and orientation of the light circles dynamically.
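One way the rendering could be sketched is to treat the 16 vertical strips as a cylinder, so that a tilted circle appears as a sine wave of pixel heights around the ring. Everything below is an assumption for illustration (the colors, the automated motion driven by millis(), and the stripOffset split between the two Teensy 4.1 boards); in the next iteration the height, tilt and phase parameters would come from the glove orientation data instead.

#include <OctoWS2811.h>

const int ledsPerStrip = 107;
DMAMEM int displayMemory[ledsPerStrip * 6];
int drawingMemory[ledsPerStrip * 6];
OctoWS2811 leds(ledsPerStrip, displayMemory, drawingMemory, WS2811_GRB | WS2811_800kHz);

const int totalStrips = 16;     // full ring
const int stripOffset = 0;      // 0 on the first Teensy 4.1, 8 on the second (assumed)

// Draw one tilted circle: 'height' is the centre (0..1 of the strip length),
// 'tilt' the amplitude of the sine offset, 'phase' its rotation in radians.
void drawCircle(float height, float tilt, float phase, int color) {
  for (int s = 0; s < 8; s++) {
    float ringAngle = 2.0 * PI * (s + stripOffset) / totalStrips;
    int pixel = (int)((height + tilt * sin(ringAngle + phase)) * (ledsPerStrip - 1));
    pixel = constrain(pixel, 0, ledsPerStrip - 1);
    leds.setPixel(s * ledsPerStrip + pixel, color);
  }
}

void setup() {
  leds.begin();
  leds.show();
}

void loop() {
  // Clear the previous frame.
  for (int i = 0; i < 8 * ledsPerStrip; i++) leds.setPixel(i, 0);

  // Automated motion for now; later these parameters would be driven by the gloves.
  float t = millis() / 1000.0;
  drawCircle(0.5 + 0.2 * sin(t), 0.15, t, 0x0000FF);        // blue circle (left hand)
  drawCircle(0.5 + 0.2 * cos(t * 0.7), 0.15, -t, 0xFF2000); // orange circle (right hand)

  leds.show();
  delay(16);                                                // roughly 60 frames per second
}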

On the auditory side, I envision pairing the light movements with evolving drone sounds. The amplitude and distortion of the sound would change in response to swift horizontal hand movements, creating a sense of energy and tension. Additionally, vertical hand motions could modulate the pitch, adding depth and variety to the soundscape. The ultimate goal is to achieve seamless synchronization between sound and visuals, where each gesture transforms the conductor into a performer and the LED ring into a living, reactive instrument.

I added dramatic sound design which is supposed to evoke entering a cybernetic liminal space.

Arrangement of samples in Ableton Live – 8 drum hits slowly panning to the left correspond with the animation movement of the 8 LED strips, with the ‘shrink’ drone at the end.
Test of the ‘Entry’ and the shrink drone with a limited light sequence (strips 1, 2 and 8 only) – the number of lights is limited this time due to space restrictions in the studio.

While The Ring series is still in its early stages, the progress so far has been exciting and illuminating. The combination of intuitive hand-controlled soundscapes and visually dynamic LED animations offers immense creative potential. Moving forward, I aim to refine the interaction mechanics, ensuring that the system is not only responsive but also rewarding for both the conductor and the audience. Each game in the series will build on the others, gradually increasing in complexity and encouraging deeper engagement with the installation.

Automated Digital ‘Hula Hoop’ animation on the LED interface, recently expanded in length.