Category Archives: Expanded Studio Practice for the 21st Century

Inspiration for a New Project: A MIDI Theremin with Visual Feedback

After extensive work with LED strips, particularly the WS2812B, I found myself inspired to create something new while working on The Ring. One improvised idea that emerged during the development of the scales in The Ring was to design a Theremin-like MIDI controller. This device would allow the user to trigger MIDI notes from a pre-programmed scale without physical contact, relying on motion sensors and providing visual feedback through LEDs. The concept was a natural extension of my work, utilising my growing expertise in coding, microcontrollers, and sensor integration.

To bring this idea to life, I built the MIDI controller using two ultrasonic sensors (HC-SR04), a 60 cm programmable WS2812B LED strip, and an Arduino Micro. The Arduino Micro, built around the ATmega32U4 chip, was particularly suitable for this project, as its native USB support allows direct MIDI communication with DAWs (Digital Audio Workstations) and other MIDI-compatible instruments. This eliminated the need for additional hardware or software bridges, making the device streamlined and efficient.

I utilised the MIDIUSB and NeoPixel libraries in C++ to program the device. The ultrasonic sensors were configured to detect hand movements within a certain range, triggering MIDI notes based on the distance of the user's hands from the sensors. Each sensor was assigned to a different musical scale, similar to the gloves in The Ring, creating a dual-layered experience. To add a layer of visual feedback, I programmed the LED strip to light up in distinct colours corresponding to each scale. This ensured that users could easily distinguish between the two scales, enhancing both the functionality and the aesthetic appeal of the device.
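The original sketch isn't reproduced here, but a minimal illustration of the approach for one of the two sensors might look like the following. The pin numbers, scale, colour and distance window are my assumptions, not the values used in the finished device; the second sensor mirrors this with its own scale and colour.

```cpp
// Hedged sketch: one HC-SR04 sensor triggering notes from an example scale
// and colouring the WS2812B strip. All constants are illustrative.
#include <MIDIUSB.h>
#include <Adafruit_NeoPixel.h>

const int TRIG_PIN = 2;                 // assumed wiring
const int ECHO_PIN = 3;
const int LED_PIN  = 6;
const int NUM_LEDS = 36;                // 60 cm strip at 60 LEDs/m

Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);

const byte SCALE[] = {60, 63, 65, 67, 70, 72};   // example: C minor pentatonic
const int SCALE_LEN = sizeof(SCALE);
byte lastNote = 0;

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // timeout: no hand present
  return duration * 0.034 / 2;                     // microseconds -> cm
}

void noteOn(byte pitch) {
  midiEventPacket_t ev = {0x09, 0x90, pitch, 100}; // note on, channel 1
  MidiUSB.sendMIDI(ev);
  MidiUSB.flush();
}

void noteOff(byte pitch) {
  midiEventPacket_t ev = {0x08, 0x80, pitch, 0};   // note off, channel 1
  MidiUSB.sendMIDI(ev);
  MidiUSB.flush();
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  strip.begin();
  strip.show();
}

void loop() {
  long cm = readDistanceCm();
  if (cm > 5 && cm < 50) {                          // playable window
    int idx = map(cm, 5, 50, 0, SCALE_LEN - 1);
    byte note = SCALE[idx];
    if (note != lastNote) {
      if (lastNote) noteOff(lastNote);
      noteOn(note);
      lastNote = note;
      // visual feedback: this sensor's scale shown in blue
      for (int i = 0; i < NUM_LEDS; i++)
        strip.setPixelColor(i, i <= idx * NUM_LEDS / SCALE_LEN
                                   ? strip.Color(0, 40, 255) : 0);
      strip.show();
    }
  } else if (lastNote) {
    noteOff(lastNote);
    lastNote = 0;
    strip.clear();
    strip.show();
  }
  delay(30);
}
```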

The result was a responsive and visually striking MIDI instrument that combined gesture-based control with dynamic lighting. The experience of using this MIDI Theremin went beyond sound; it became a multisensory interaction where movement, sound, and light converged seamlessly.

The MIDI Theremin was successfully performed during the Chronic Illness XXIII event, showcasing its potential in a live performance setting. Watching it in action during the event confirmed its versatility, not just as a standalone instrument but also as a tool for enhancing interactive installations or live sets. I definitely plan to incorporate this MIDI Theremin as a permanent feature in my setup for live musical performances.

Performing the MIDI Theremin at Chronic Illness XXIII

Commission for motion tracking gloves

The inspiration for The Ring began in an unexpected and somewhat serendipitous way. I came across a striking visual of an unknown installation while scrolling through social media. The image captured my imagination, and I immediately thought, “I’d like to replicate something like this.” What began as a visual exercise—a simple attempt to recreate the aesthetic appeal of the installation—soon evolved into a much more ambitious project. As I delved deeper, I realized the potential to expand the concept by incorporating sonic and interactive elements. These additions aligned with my broader interest in creating immersive, multisensory experiences.

The idea began to take form after a shift in priorities around a separate commissioned piece I was working on at the time. Suddenly free to explore my own creative directions, I decided to use this opportunity to build upon the initial inspiration. What started as a purely visual experiment grew into an exploration of audience interaction, embodiment, and the integration of sound and motion.

In September, my friend Matteo Chiarenza Santini approached me with an intriguing request. Matteo was collaborating on a live performance for FKA Twigs and had been tasked with sourcing a pair of simple, interactive motion-tracking gloves for the performance. He reached out to me, asking if I could create a prototype that would meet the technical requirements.

Excited by the challenge, I began working on the gloves. Using my experience with Arduino and similar technologies, I designed a simplified version of an earlier prototype. The gloves featured BNO055 IMU sensors for precise motion tracking and ESP32 microcontrollers for data collection and Wi-Fi transmission. Each glove was capable of sending raw x, y, z axis motion data to a Teensy board, which interfaced with Max For Live, enabling users to control parameters in real-time. Additionally, the gloves supported direct MIDI communication, making them compatible with Ableton Live and other DAWs.
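The glove firmware itself isn't shown in this post. As a rough, hypothetical sketch of the sensor side, an ESP32 reading the BNO055 and streaming Euler angles over Wi-Fi as UDP text could look like the code below; the network name, addresses and port are placeholders, and the actual gloves forwarded data to a Teensy bridge rather than straight to the computer.

```cpp
// Hedged sketch of one glove: ESP32 reads fused orientation from the BNO055
// and streams it over Wi-Fi as plain text. Credentials and IP are placeholders.
#include <WiFi.h>
#include <WiFiUdp.h>
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28);   // default I2C address
WiFiUDP udp;
const char* SSID = "glove-net";                    // placeholder network
const char* PASS = "********";
IPAddress receiver(192, 168, 4, 1);                // placeholder receiver
const uint16_t PORT = 8888;

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("BNO055 not found");
    while (true) delay(10);
  }
  WiFi.begin(SSID, PASS);
  while (WiFi.status() != WL_CONNECTED) delay(100);
}

void loop() {
  // One call returns fused absolute orientation as Euler angles
  imu::Vector<3> e = bno.getVector(Adafruit_BNO055::VECTOR_EULER);
  char msg[48];
  snprintf(msg, sizeof(msg), "%.1f %.1f %.1f", e.x(), e.y(), e.z());
  udp.beginPacket(receiver, PORT);
  udp.print(msg);
  udp.endPacket();
  delay(20);                                       // ~50 Hz update rate
}
```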

Although the gloves were completed, they were ultimately not used in FKA Twigs’ performance. Initially, this was disappointing. However, the experience of building the gloves and the creative potential they represented sparked a new wave of ideas for me. What if these gloves became the foundation for the interactive installation I had been contemplating? The thought of integrating motion-tracking gloves into an installation seemed like the perfect opportunity to explore the interplay between movement, sound, and interactivity on a deeper level.

This unexpected twist marked a turning point in the development of The Ring. The gloves became the starting point for an installation that not only reflected my fascination with visual aesthetics but also pushed me to explore how movement could shape soundscapes and create immersive environments. What began as a technical experiment transformed into a project driven by the potential to blur boundaries between performer, audience, and artwork.

Sonokinetic gloves 2.0 – Laying out ideas for improvements

Creating the prototype of the Sonokinetic gloves was an important step in figuring out the possibilities and limits of wireless wearable MIDI controllers. After further research and new knowledge of digital technologies, I decided to build new gloves that would be more stable, faster and more accurate in performance. The crucial improvements are switching from Bluetooth to WiFi and making greater use of electronic textiles. Building the gloves from scratch now seems like the more logical approach.

Sonokinesis performance and collaboration in Czech Republic

For the composition I would like to create something using the wireless cybernetic MIDI controller which I have been developing since September 2023. Since then I have achieved various advancements and identified a lot of flaws.

Last week I travelled to the Czech Republic and joined two collaborations. I wanted to find out what it is like to play with other musicians. At this stage the Sonokinesis suit and gloves are simply MIDI controllers; they don't generate any sound themselves but only wirelessly control parameters within the DAW Ableton Live.

In Brno I performed on 9th May 2024 at Rello Il Torrefattore, Rybarska 13. The venue is primarily a coffee roastery, but it resides in the same building as Bastl Instruments, a world-renowned developer of Eurorack modules. Many experimental musicians and sound artists naturally gravitate towards this quite hidden, little-known venue.

We built an improvised octophonic ring in the garden, and two friends of mine, Valentino and Jakub, played their modular systems and sent me two stereo signals. The movements of my limbs, head and fingers affected and modulated their audio signals, and together we created an improvised noise-ambient spatial soundscape.

Sonokinesis Performance at Rello Il Torrefattore, Brno, Czech Republic

During the performance I realised the importance of the spatial setup for the project. It could certainly be used in stereo or in ambisonics; however, I believe that since the body exists in 3D space, the whole performance should exist sonically in space as well. Therefore I will aim for a spatial setup wherever possible in the future.

Sonokinesis Performance at Rello Il Torrefattore, Brno, Czech Republic

The next performance took place at KMart, a small bar in Ostrava, on Friday 10th May. This time I brought my wireless cybernetic MIDI controller to join noise and techno musician Jeronym Bartos aka Laboratorium Expiravit and violinist and vocalist Samuela aka Soleom.

The situation was fairly similar: they sent me their audio signals and I modulated them with the movement of my body, this time in a quadraphonic setup. Given the nature of their musical taste, we created an improvised dark ambient soundscape.

Sonokinesis Performance at KMart, Ostrava, Czech Republic

At this stage I have been thinking about how different this approach is from what I have been used to until now. My body is becoming a musical instrument in the sphere of electronically produced live music. I have never been a dancer; however, during both performances I was acting and then reacting to the sonic situations which emerged. From a choreographic point of view I have no idea how the performance might be perceived, as I feel I am still at the stage of getting to know my new instrument and discovering what my own body can do in a sonic context.

Sonokinesis Performance at KMart, Ostrava, Czech Republic

Work-in-progress I

At my end, I have started to build the 'Core Station', which will process data from all sensors and transmit it via WiFi. It will be attached to the performer's back. Based on my research, I have decided to upgrade the microcontroller from the ESP32 used in the previous prototype to a Teensy 4.1. ESP32s are still being used, but only for the WiFi connection. The Teensy 4.1 contains an ARM Cortex-M7 microprocessor clocked at 600 MHz, a significant improvement over the ESP32's Tensilica Xtensa LX6 clocked at 240 MHz, allowing fast, real-time data transfer from multiple sensors at the same time.

Teensy 4.1

The Teensy will gather data from two GY-521 (MPU6050) accelerometer sensors attached to the feet, two elbow and two knee stretch sensors, and a BNO055 (9-DOF absolute orientation sensor) situated in the headpiece. Data from the sensors will be sent via a UART connection to an ESP32 WROOM-32U access point. I considered an SPI connection, but I struggled to find appropriate libraries in the Arduino IDE and learnt that it would require learning a different IDE. I tested UART, which I am familiar with, and it proved sufficient; however, I may still sort out an SPI connection in the future.
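A hedged sketch of what the Core Station loop might look like; the sensor pins, I2C addresses and the line format are my assumptions, not the actual firmware.

```cpp
// Hedged sketch of the Core Station on the Teensy 4.1: two MPU6050s (feet),
// a BNO055 (headpiece) and four analog stretch sensors are read, and the
// values are pushed as one text line over UART (Serial1) to the ESP32.
#include <Wire.h>
#include <Adafruit_MPU6050.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_MPU6050 footL, footR;
Adafruit_BNO055 head = Adafruit_BNO055(55, 0x28);
const int STRETCH_PINS[4] = {A0, A1, A2, A3};   // assumed: elbows and knees

void setup() {
  Serial1.begin(115200);                 // UART link to the ESP32
  Wire.begin();
  footL.begin(0x68);                     // AD0 low
  footR.begin(0x69);                     // AD0 high
  head.begin();
}

void loop() {
  sensors_event_t aL, gL, tL, aR, gR, tR;
  footL.getEvent(&aL, &gL, &tL);
  footR.getEvent(&aR, &gR, &tR);
  imu::Vector<3> e = head.getVector(Adafruit_BNO055::VECTOR_EULER);

  // One whitespace-separated line per frame; easy to split in Max MSP
  Serial1.printf("%.2f %.2f %.2f %.2f %.2f %.2f %.1f %.1f %.1f",
                 aL.acceleration.x, aL.acceleration.y, aL.acceleration.z,
                 aR.acceleration.x, aR.acceleration.y, aR.acceleration.z,
                 e.x(), e.y(), e.z());
  for (int i = 0; i < 4; i++) Serial1.printf(" %d", analogRead(STRETCH_PINS[i]));
  Serial1.println();
  delay(10);                             // ~100 Hz
}
```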

On the receiving end, another ESP32 WROOM-32U is connected to the computer and sends the raw numerical data to Max MSP. The ESP32 WROOM-32U has a specific feature: the possibility of attaching an external WiFi antenna. This significantly improved the data transmission and range.
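A matching sketch of that receiving end: the client ESP32 simply relays each packet to the computer over USB serial, where Max MSP can read it with a [serial] object. Credentials and port are placeholders.

```cpp
// Hedged sketch of the receiving ESP32 WROOM-32U: listen for UDP packets
// from the access point and relay them unchanged over USB serial.
#include <WiFi.h>
#include <WiFiUdp.h>

WiFiUDP udp;
const uint16_t PORT = 8888;

void setup() {
  Serial.begin(115200);
  WiFi.begin("sonokinesis-ap", "********");   // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(100);
  udp.begin(PORT);
}

void loop() {
  int size = udp.parsePacket();
  if (size > 0) {
    char buf[128];
    int len = udp.read(buf, sizeof(buf) - 1);
    buf[len] = '\0';
    Serial.println(buf);     // raw numbers straight into Max MSP
  }
}
```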

ESP32 WROOM-32U with the antenna.
Prototyping the Core Station on the breadboard – Teensy board, ESP32 Access Point device (Sender) and IMU sensors.
ESP32 Client device (Receiver)
Testing the speed of data transfer
Testing the range of the WiFi connection

Emerging team

For a while I have been seeking a fashion designer or maker, someone who appreciates a similar aesthetic to mine, so that the result could become a joint effort rather than a commission.

After discovering conductive rubber, I realised that it could be efficiently combined with latex. I approached a friend of a friend, the latex maker and designer Ella Powell @exlatex.

Ella Powell has been working with latex clothing and sheeting for the past two years. She studied a short course in latex making at Central Saint Martins over the summer of 2022 and is currently studying for a master's degree in computer science and AI.

After an initial meeting, we drafted some ideas for creating latex-based, organic-looking, futuristic sensors which will efficiently collect data from the bending of knees and elbows. Below you can see an AI-generated impression of the direction in which the piece might evolve.

Also joining the team is @Elixir31012, a multimedia artist duo formed in 2023 by digital artist Jing Xu @abadteeth and sound artist Jiyun Xia @xiaji2tsunagi2. Both graduated from the Royal College of Art with a degree in Fashion in 2023. Elixir31012 creates an otherworldly landscape of non-linear time through digital animation, experimental music making, wearable technology, and performance. Cyborg studies, myth, ritual, and feminist narratives are intertwined in their work. Elixir31012 takes its name from the Chinese Taoist vocabulary for "elixir": 3 represents "Synthesis", 10 the "Sacred", and 12 the "Eternal Return". Their sound art performance at the event Chronic Illness XX intrigued me, and we started talking. The idea to collaborate emerged quickly and organically from our shared interests in creative technology, cyborg art and sound art. Elixir31012 proposed making a headpiece which would carry the motion sensor for the Sonokinesis performance.

Elixir31012 performing at IKLECTIK Art Lab

Declan Agrippa @hiyahelix, a second-year Sound Arts student at University of the Arts London, London College of Communication, is going to create the sound design using the virtual wavetable synthesiser Serum.

Below you can see the work-in-progress sketches of the sensor headpiece in ZBrush.


Multi-disciplinary and kinaesthetic artist Ona Tzar @ona.tzar is joining the team as a performer. Her creative input has been very important in developing the whole system, because we would like the garments to be as 'user friendly' as possible. We have been actively discussing materials, sensor positions, and the shapes of the garments and the headpiece, trying to find the right balance between an 'experimentalist aesthetic' and keeping all pieces comfortable, functional and reliable in performance.

Sonokinesis: Part II – Drafting ideas for the collaboration

Last term I introduced the foundations of the project Sonokinesis: the idea of controlling sound through movement and other aspects of the human body. I made a pair of wireless interactive Arduino-based gloves which allow the wearer to control sound in the visual programming language Max MSP and map it into Ableton Live via Max For Live. The piece has been performed on two occasions so far, at Gallery 46 in Whitechapel and at Chronic Illness XXI, an underground event featuring performance art and experimental sound. Those performances revealed many flaws, and I have started to troubleshoot and upgrade the project, mainly addressing the unstable Bluetooth connection and the significant fragility of the assembled pieces.

The idea of Sonokinesis certainly doesn't stop at a pair of Arduino gloves, and I aim to develop a more stable and durable version of the gloves, followed by other garments allowing the performer to incorporate other parts of their body.

I have been experimenting with flexible pads for knees and elbows and created a simple accelerometer-based headpiece triggering MIDI notes or samples. All of these connect to a central mini-computer attached to the lower back with a belt. The central mini-computer is this time based on a different microcontroller, the ESP32 WROOM-32, and wireless connectivity is handled over WiFi, which has proved more stable and faster than Bluetooth.

Assembling the wearable station based on the ESP32 WROOM-32
Headpiece carrying sensor MPU6050 (accelerometer and gyroscope)

For the knees and elbows, I first assembled wearable pads based on the same flex sensors I used for the fingers of the gloves. Unfortunately, they proved highly inefficient in terms of durability. Their function was limited by fragility: the sensors started to break and rip after even a single use, which must be avoided at all costs, since the piece has to remain stable during performance and reusable. The cost of flex sensors is also quite high considering their fragility (about £15 per sensor).

Not long ago I discovered conductive rubber, which changes its conductive properties when stretched. I tested a strip cut from a sheet and attached to a knee pad, and it proved very efficient, durable and, compared to flex sensors, far cheaper.

A strip cut from the sheet, attached to the knee pad, changes its electrical resistance based on the stretch applied as the knee bends.

Troubleshooting and rethinking various types of Bluetooth connection

The previous way of transferring data from the Arduino to the computer with Bluetooth turned out to be ineffective and, in fact, worked only once. I couldn't figure out why I was never able to connect the Arduino and laptop via the HC-05 again. I started to research the history of Bluetooth and its various types over the years, and settled on a Bluetooth connection based on Central and Peripheral devices ('master' and 'slave' in older terminology).

I kept experimenting with the HC-05 module, but as I learnt, it is not ideal for every application. It is an older technology based on Bluetooth 2.0, introduced to the market in 2005, which consumes more energy and is generally slower than Bluetooth 4.0, aka BLE (Bluetooth Low Energy), introduced in 2010. The HM-10 is a BLE module which I experimented with, until I discovered the Arduino Nano 33 BLE, which, as the name suggests, has Bluetooth Low Energy built into the microcontroller itself. The Arduino Nano 33 BLE also has a built-in LSM9DS1 sensor, which combines an accelerometer, gyroscope and magnetometer, similar to the BNO055.

I experimented with the LSM9DS1 as well, but realised that for my purposes I would need the BNO055. Why? The BNO055 can produce Euler-angle absolute orientation data from a single piece of syntax within the code; I found nothing like that for the LSM9DS1. It is possible to program the LSM9DS1 with a sensor fusion algorithm based on specific equations in order to obtain Euler angle values, but implementing them in functional C++ code currently goes beyond my beginner's abilities. However, it is definitely something I will look into in the near future, because getting Euler angle values from the built-in sensor would mean overall optimisation by getting rid of a potentially redundant external sensor.
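For reference, with Adafruit's BNO055 library that single piece of syntax is one call returning the fused absolute orientation:

```cpp
// Fused absolute orientation as Euler angles in a single library call
imu::Vector<3> euler = bno.getVector(Adafruit_BNO055::VECTOR_EULER);
float heading = euler.x(), roll = euler.y(), pitch = euler.z();
```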

For now, I have stuck with the BNO055. I found code for Peripheral and Central devices for the Arduino Nano 33 BLE on GitHub and modified it to get Euler angle values from the BNO055.

CODE FOR PERIPHERAL DEVICE:
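(The original embedded code is no longer available on this page; below is a minimal reconstruction of what such a modified peripheral sketch looks like with the ArduinoBLE and Adafruit BNO055 libraries. The UUIDs and local name are placeholders.)

```cpp
// Reconstruction: an Arduino Nano 33 BLE reads Euler angles from the BNO055
// and notifies them as three floats over a custom BLE characteristic.
#include <ArduinoBLE.h>
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28);
BLEService imuService("19B10000-E8F2-537E-4F6C-D104768A1214");
BLECharacteristic eulerChar("19B10001-E8F2-537E-4F6C-D104768A1214",
                            BLERead | BLENotify, 12);   // 3 x float

void setup() {
  bno.begin();
  BLE.begin();
  BLE.setLocalName("BNO055-Glove");
  BLE.setAdvertisedService(imuService);
  imuService.addCharacteristic(eulerChar);
  BLE.addService(imuService);
  BLE.advertise();
}

void loop() {
  BLEDevice central = BLE.central();
  while (central && central.connected()) {
    imu::Vector<3> e = bno.getVector(Adafruit_BNO055::VECTOR_EULER);
    float v[3] = {(float)e.x(), (float)e.y(), (float)e.z()};
    eulerChar.writeValue((uint8_t*)v, sizeof(v));   // notify subscribers
    delay(20);
  }
}
```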

CODE FOR CENTRAL DEVICE:
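(Again a reconstruction rather than the original: the central device scans for the peripheral, subscribes to the characteristic and prints the raw values to the serial monitor.)

```cpp
// Reconstruction: a second Nano 33 BLE connects to the peripheral above and
// prints the raw x, y, z Euler values to the serial monitor.
#include <ArduinoBLE.h>

void setup() {
  Serial.begin(115200);
  BLE.begin();
  BLE.scanForName("BNO055-Glove");
}

void loop() {
  BLEDevice peripheral = BLE.available();
  if (!peripheral) return;
  BLE.stopScan();
  if (peripheral.connect() && peripheral.discoverAttributes()) {
    BLECharacteristic eulerChar =
        peripheral.characteristic("19B10001-E8F2-537E-4F6C-D104768A1214");
    if (eulerChar && eulerChar.subscribe()) {
      while (peripheral.connected()) {
        if (eulerChar.valueUpdated()) {
          float v[3];
          eulerChar.readValue((uint8_t*)v, sizeof(v));
          Serial.print(v[0]); Serial.print(' ');
          Serial.print(v[1]); Serial.print(' ');
          Serial.println(v[2]);
        }
      }
    }
  }
  BLE.scanForName("BNO055-Glove");   // reconnect if the link drops
}
```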

The result of these sketches is that x, y, z data from the BNO055 connected to the Peripheral Arduino device is sent via BLE to the Central Arduino device, which is connected to the computer and prints the pure x, y, z numerical values of the Euler angles in the serial monitor.

Getting the numerical values into the serial monitor in a specific format and at a specific speed will allow me to work with them further in Max For Live. I have created a Max device which is mappable to anything in Ableton Live, but with a specific focus: this device in particular will control a Surround Panner connected to the octophonic ring.

Connecting BNO055 sensor with Arduino and sending data to the computer via Bluetooth

The other week, I progressed with connecting the Arduino Nano to the BNO055 and sending its data via the Bluetooth module HC-05. As described in a previous post, the BNO055 is an accelerometer, gyroscope and magnetometer in one device. I intend to implement the BNO055 in the glove and use it to control spatial panning: essentially sending the sound to a particular spot by pointing in that direction inside the octophonic ring.

I used the help of ChatGPT to generate the code for the Arduino Nano, which wirelessly sends the raw position data of the three axes, x, y and z, to the computer.
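That generated code isn't reproduced here; a minimal sketch along the same lines, with assumed pins and baud rates, would be:

```cpp
// Hedged sketch: an Arduino Nano reads BNO055 Euler angles over I2C and
// streams them through an HC-05 module on a SoftwareSerial port.
#include <SoftwareSerial.h>
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

SoftwareSerial bt(10, 11);               // assumed RX, TX pins to the HC-05
Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28);

void setup() {
  bt.begin(9600);                        // HC-05 default baud rate
  bno.begin();
}

void loop() {
  imu::Vector<3> e = bno.getVector(Adafruit_BNO055::VECTOR_EULER);
  bt.print(e.x()); bt.print(' ');
  bt.print(e.y()); bt.print(' ');
  bt.println(e.z());
  delay(50);
}
```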

The video below shows the Serial Monitor of the Arduino IDE receiving raw data in the form of numbers from the three axes: x, y and z. The cable is used here only to power the Arduino; it will, of course, be replaced by battery power.

The next stage will be creating a Max device so I can take the x, y and z data and translate it into MIDI, which can then be mapped to the Max For Live device called Surround Panner.

Creating a Calibration device for sensors in Max and implementing it into Ableton Live

In the previous post I described a problem I encountered: a flex sensor not covering the whole MIDI scale of 0-127 and having its initial point somewhere in the middle of the scale. My idea is that flex sensors will be carefully positioned on the fingers of the gloves as well as at the bending points of the suit (under the knee, in the crook of the arm, and potentially on other bending parts of the body). I want each flex sensor to control the entire MIDI scale from 0-127 and be mappable to any parameter in Ableton in cooperation with the Arduino Max For Live device.

Uncalibrated flex sensor mapped to the Dry/Wet parameter of a reverb via the Arduino device from Max For Live's Connection Kit

In the video above, you can see that the initial point of the flex sensor sits at around 40% (approx. 43 on the MIDI scale) of the Dry/Wet parameter and reaches around 80% (approx. 86 on the MIDI scale). I will set the same input values in the calibrator to demonstrate its function in the video below; this time I mapped the output to the decay of the reverb.

As you can see, once the parameter sent by the sensor reaches the threshold set on the MIDI input (42), it triggers the output mapped to the reverb's decay and controls its full scale until it reaches the sensor's highest possible value (86).

Calibrated flex sensor ‘attached’ on the elbow

This calibration device has the potential to calibrate any other sensor sending unstable values, stabilising them into the desired MIDI parameters. It can also be used in reverse, when the input MIDI information sends the full 0-127 scale and the output needs a particular threshold and limit.
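Under the hood, this is a linear rescale with clamping. The same arithmetic expressed in Arduino terms, using the approximate 43-86 window from the videos above:

```cpp
// The arithmetic behind the calibration: clamp the sensor's usable window
// and stretch it across the full 0-127 MIDI range, as Max's zmap object does.
int calibrate(int value, int inLow = 43, int inHigh = 86) {
  value = constrain(value, inLow, inHigh);
  return map(value, inLow, inHigh, 0, 127);
}
// calibrate(43) -> 0, calibrate(64) -> 62, calibrate(86) -> 127
```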

The core of the device is a zmap object, which maps an input range of values onto an output range, combined with a map button patch. The schematic of the Max patch is below.

The calibration device will eventually be extended to accommodate multiple parameters (for example, controlling five different flex sensors attached to the fingers of one hand), with map buttons added to its inputs.

Circuit-bending different strobe lights, enabling them to be triggered by a CV gate signal

The idea was to have an analog Eurorack kick module triggering the strobe light so that their rhythmic patterns match on the go during a live performance. I first researched how to achieve such an effect digitally, but in the end I chose an analog approach, which turned out to be the simplest, most straightforward and cheapest.

I used a relay to trigger the light with the gate signal. Most relays require at least 5V to switch, but I found a relay which switches at 3V. The trigger is set as 'NO' (normally open), so it closes the circuit only momentarily, much like pressing a button.

Battery-powered strobe light triggered by a gate from the Arturia Beatstep Pro. The Arduino here is used only as a 3.3V power supply.
Arturia Beatstep Pro gate triggers Kick module together with strobe light via signal splitter
Sketch of the schematics
Application of the relay to another strobe light powered by 240V. The discharge lamp peaks at 400V.
Application of the gate-triggered strobe light in performance – I positioned the strobe within a coffin chamber of the crypt under the church in St Pancras, where I performed live on 29th October 2023.

Making an analog Arduino MIDI controller and testing the flex sensor

I have assembled the controller according to instructions in the video below.

It is a very simple four-knob analog controller which can be connected to Ableton Live via the Connection Kit from Max For Live.

The purpose of this exercise is to find out how the flex sensor behaves when it replaces a potentiometer. A flex sensor, or bend sensor, measures the amount of deflection or bending. Usually the sensor is stuck to a surface, and the resistance of the sensor element varies as the surface bends. A flex sensor therefore behaves in a similar way to a potentiometer: by changing its resistance, it changes the amount of electric current in the circuit, which changes the parameters mapped within the Connection Kit.

Flex sensor
Testing flex sensor replacing potentiometer
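For context, the flex sensor is read as one half of a voltage divider on an analog input. Below is a hedged standalone sketch of the test; the pin, fixed resistor value and serial output are my assumptions (the Connection Kit itself talks to the board via Firmata, so this sketch is only for observing the raw values).

```cpp
// Hedged sketch: the flex sensor and a fixed ~47k resistor form a voltage
// divider on A0, and the 0-1023 reading is scaled to MIDI 0-127, just as a
// potentiometer's position would be.
const int FLEX_PIN = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(FLEX_PIN);        // rises and falls as the sensor bends
  int midi = map(raw, 0, 1023, 0, 127);  // same scaling as a potentiometer
  Serial.println(midi);
  delay(20);
}
```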

The flex sensor, however, does not react in quite the same way as the potentiometer. I mapped it to the Dry/Wet parameter of a reverb: it only starts at about 30% and reaches up to 60% when fully bent. The next step will be to figure out how to calibrate the flex sensor so it moves fluently through 0 to 100% of the Dry/Wet parameter.

Industrial Violin (work in progress)

The other week I got myself a very cheap beginner's violin. My friend asked me, "Oh, so are you going to learn how to play the violin now?" I answered, "No, I will destroy it!" Of course, I was joking; that would be a horrendous thing to do! Although some aspects of what I am going to do with it could be considered destruction, I will actually be reframing the original instrument and using the beautiful resonance of the violin's wooden body for something else.

The project is inspired by the Instagram feed of musician Denis Davydov @davdenmusic: a classical violin with a contact microphone attached inside the body. I will be attaching various metallic objects, such as springs, a kalimba or a reassembled music box, to the body of the violin. Some objects haven't been found or decided on yet; it is a fluid work in progress, with many variables coming in and out whilst crafting this instrument.

The Industrial Violin will eventually become part of my live performance and sound design practice.

Wireless Data Streaming Using BNO055 and Bluetooth and Estimating Orientation Using Sensor Fusion

The very first approach to translating movement into sound via motion capture that I decided to experiment with uses the Adafruit BNO055 absolute orientation sensor. Its sensor fusion algorithms blend accelerometer, magnetometer and gyroscope data into a stable three-axis orientation output, which is processed in the Arduino and sent to Ableton Live via Bluetooth.

Bosch Adafruit BNO055

During the programming and calibration, I encountered several problems. Programming and calibration are done in MATLAB, which at first didn't allow me to upload the code to the Arduino so it could work standalone. The Bluetooth module I ordered was not supported, so I ordered a different one, and programming and calibration were executed via the USB cable. They were somewhat successful, and I managed to connect the device to Ableton Live via the Max for Live Arduino plugin, although after I disconnected the USB I had to write the whole code again and keep the device connected to the laptop. Later the device stopped reacting, and after enabling the Arduino plugin in Ableton Live, the program in MATLAB started showing an error. At this point I need to figure out how to upload the program from MATLAB onto the Arduino chip.

From the top: Bluetooth Module (unsupported type – will be replaced by HC-05), Arduino Uno, BNO055
Simple wireless motion capture data streaming device assembled
Programming and calibration of the device in MATLAB
Various positions of the device register in the Arduino plugin in Ableton Live and change the parameters of the mapped low-pass filter in quite a simple way.

Loading the code onto the Arduino chip from MATLAB appeared to be a very complicated process, so I decided to try a different route: loading the code from the original Arduino IDE. I managed to load the code and run the calibration test via a Chrome browser extension, still connected via USB. The next step will be to figure out how to connect the BNO055 and make it work with Ableton Live, either via the existing Connection Kit or by finding or creating a Max For Live device, and to sort out the connection via Bluetooth instead of the USB cable.

Sound Suit – Developing the idea and exploring Motion Capture

I have been intrigued for a while by the idea of translating physical movement into sound. The idea, which isn't new at all, still fascinates me from the point of view of reversed dance: people react to structured sound in the form of music all the time, but what if we did it the other way around and composed music by dancing, turning our bodies into musical instruments?

Probably the best-known similar concept was developed by British composer and singer Imogen Heap, who introduced her MI.MU Gloves in 2010: musical gloves which allowed her to control her music performance on the move.

I am aiming to create an interactive bodysuit, which will include gloves too, partially inspired by Imogen Heap's concept, but employing different parts of the body, such as all the limbs and possibly the neck and hips as well, and combining the use of the suit with an interactive laser installation.

I will be developing this concept in collaboration with music producer, singer, performer and kinaesthetic artist Ona Tzar. The creative input of 'a dancer' will be an important part of the project, in order to adapt its function to live performance.

Solo Duel by Ona Tzar

What will be used in the final piece, and how it will work, is at this point unknown, but there are several features which I would like to achieve and keep as a basis. I am aiming for the actual devices to be as discreet as possible and for the final suit to be fashionable, becoming an art piece in itself whilst maintaining high functionality. The suit will become an artwork at the intersection of sound arts, music performance, dance performance and fashion.

I have started to research various technologies and concepts which could potentially be included in the suit. At this early stage, I am exploring wireless data streaming using the BNO055 absolute orientation sensor, Bluetooth technology and Arduino, estimating orientation using sensor fusion, the HC-SR04 ultrasonic ranging module, and flex bend sensors, which incidentally were also used in Imogen Heap's gloves.

Wireless Data Streaming System using Arduino, BNO055 and Bluetooth
Motion Sensing Device Controller using Ultrasonic Ranging Module HC-SR04
Flex Bend Sensor