Category Archives: Collaborating

Custom LED system for music performance of Ona Tzar

One idea which emerged from my growing interest in creative technology was creating a custom light system which could add a strong visual aspect to a live music performance.

Based on my previous work with programmable LED strips and microcontrollers, I have made MIDI-reactive strips. The strips are circuit-bent floor lamps. The lamps initially contained a chip for automated animations; however, I did not find these very attractive, and the audio reactivity was quite basic. The design of the lamps, their portability and the fact that they contain easily programmable LED lights inspired me to create something new. I removed the chip and replaced it with a 3.5mm female jack, so each lamp becomes part of a “screen” of 7 lamps.

Each lamp is connected to a separate output of an Arduino Leonardo (it contains the ATmega32U4 chip, which allows direct USB MIDI control). The Arduino Leonardo is programmed to receive MIDI notes from a DAW or a MIDI instrument. Each lamp is addressed by a single MIDI note on the scale from 0 to 127. A single MIDI note can carry a different colour, pattern or animation. With 7 lamps we then have the possibility of 18 unique series of animated colour patterns on a single MIDI channel (128 notes / 7 lamps = 18 complete series). If we need more, we can simply program more animations on a different MIDI channel (16 in total, so in this case we can get 16 × 18 = 288 variations).
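The note-to-lamp mapping described above can be sketched as a couple of small helpers. This is a hedged sketch: the function names and the exact grouping of notes into blocks of 7 are my assumptions, not the actual firmware.

```cpp
#include <cassert>

// Assumed layout for 7 lamps: MIDI notes 0..127 are grouped in blocks
// of 7, so each block selects the same animation on a different lamp.
const int kNumLamps = 7;

// Which of the 7 lamps a note addresses.
int noteToLamp(int note) { return note % kNumLamps; }

// Which animation/colour pattern the note selects (0..17 on one
// channel, since 128 / 7 = 18 complete blocks of 7 notes).
int noteToAnimation(int note) { return note / kNumLamps; }

// Total variations across multiple MIDI channels, e.g. 16 * 18 = 288.
int totalVariations(int channels) { return channels * (128 / kNumLamps); }
```

With this layout, note 14 would, for example, address lamp 0 with animation 2, and all 16 channels together give the 288 variations mentioned above.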

The light patterns can either be played live from MIDI controllers, via Ableton Live sending MIDI note messages to the Arduino Leonardo, or hand-written in the piano roll.

For the live performance of Ona Tzar we decided to create hand-written piano roll MIDI clips, so they can run as live light loops together with the audio clips.

Light pattern hand-written in the piano roll – this allows exact timing and an adequate response to the music or sound.
Test of various animations with Ona Tzar’s single Hypnagogia
Test of various animations with Ona Tzar’s single Hypnagogia, with additional strobe lights connected to relays, also triggered by MIDI notes
Final video of Ona Tzar’s live performance of Hypnagogia. The first video of an upcoming triptych of live performances.

Test performance of the sonokinetic crown and limb stretch sensor prototypes

The final outcome of the collaboration was going to be a short performance of all the assembled pieces (the core station attached to the crop-top, the latex-based sensors and the 3D-printed crown) in the octophonic ring, executed by the choreographer and kinaesthetic artist Ona Tzar, following the composition and sound design by Declan Agrippa.

Unfortunately, Ona Tzar couldn’t join due to travelling and work obligations. The latex-based sensors by Ella Powell are not finished at this stage, therefore we used prototypes made of orthopedic support sleeves.

The performance was recorded with a Sennheiser Ambeo VR ambisonic microphone and a MixPre-6 II audio recorder. First you can listen to the export of Declan’s abstract composition made of field recordings, and then to the ambisonic recording together with the video.

Tilting the head to the left controls the reverb. Both elbows control the amount of high-pass filtering on the two main tracks, which therefore sometimes go completely silent. The knees control the Brownian Delay, an ambisonic delay device from the ambisonic package of Max for Live plugins called Envelop for Live.
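The head-tilt-to-reverb mapping could be sketched roughly as below. This is a hedged illustration, not the actual patch: the angle range, the 0–127 output scale and the function name are all my assumptions.

```cpp
#include <cassert>
#include <algorithm>

// Hypothetical mapping from a head-tilt angle (degrees, negative =
// tilting left) to a reverb amount 0..127, as the sensor value might be
// scaled before reaching the effect. Only a leftward tilt opens the reverb.
int tiltToReverb(float angleDeg) {
    if (angleDeg >= 0.0f) return 0;                  // tilting right: dry
    float norm = std::min(-angleDeg / 90.0f, 1.0f);  // full wet at -90 deg
    return static_cast<int>(norm * 127.0f);
}
```

The elbow and knee mappings would follow the same shape, just targeting the high-pass filter and the delay instead.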

Work-in-progress V: Latex Sensor prototypes and attaching lights to the Crown

Ella has been working on a prototype of a sensor similar to the AI sketch presented earlier. She has made a 3D latex tube which will probably be attached to the conductive rubber, together with other latex pieces glued to it. The piece is still quite abstract at this stage; however, further ideas on how to sort out the right functionality are emerging.

The LED strips were attached to the inside support structure of the crown. To attach them I used resin glue cured with UV light. There will be a total of four strips: two short ones on the sides and two forks on the top. Each strip or fork can be controlled and programmed separately. For now the strips were tested with an external Arduino, mainly to check their functionality after attachment, but the idea is to connect them to the Core Station on the performer’s back. The light pattern will react to the motion data from the BNO055 sensor on the crown.
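One simple way the BNO055 motion data could drive the crown’s strips is mapping the magnitude of the rotation rate to LED brightness. A hedged sketch, with the scaling threshold being my assumption:

```cpp
#include <cassert>
#include <cmath>

// Sketch: map the magnitude of the BNO055 rotation rate (deg/s on
// three axes) to an LED brightness 0..255. The full-brightness
// threshold of 250 deg/s is an assumption for illustration.
int motionToBrightness(float gx, float gy, float gz) {
    float mag = std::sqrt(gx * gx + gy * gy + gz * gz);
    const float kMax = 250.0f;       // assumed full-brightness rotation rate
    if (mag >= kMax) return 255;
    return static_cast<int>(mag / kMax * 255.0f);
}
```

On the Core Station this value would then be written to the relevant strip or fork each frame.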

Sonokinesis performance and collaboration in Czech Republic

For the composition I would like to create something using the wireless cybernetic MIDI controller which I have been developing since September 2023. Since then I have achieved various advancements and discovered a lot of flaws.

Last week I travelled to the Czech Republic and joined two collaborations. I wanted to experiment with how it feels to play with other musicians. The Sonokinesis suit and gloves are at this stage simply MIDI controllers: they don’t generate any sound themselves but only wirelessly control parameters within the DAW Ableton Live.

In Brno I performed on 9th May 2024 at Rello Il Torrefattore, Rybarska 13. The venue is primarily a coffee roastery, but it resides in the same building as Bastl Instruments, a worldwide-renowned developer of Eurorack modules. A lot of experimental musicians and sound artists naturally gravitate towards this quite hidden and little-known venue.

We built an improvised octophonic ring in the garden, and two friends of mine, Valentino and Jakub, played their modular systems and sent me two stereo signals. The movements of my limbs, head and fingers affected and modulated their audio signals, and overall we created an improvised noise-ambient spatial soundscape.

Sonokinesis Performance at Rello Il Torrefattore, Brno, Czech Republic

During the performance I realised the importance of the spatial setup for the project. It could certainly be used in stereo or in ambisonics; however, I believe that since the body exists in 3D space, the whole performance should exist sonically in space as well. Therefore I will aim for a spatial setup as much as is possible in the future.

Sonokinesis Performance at Rello Il Torrefattore, Brno, Czech Republic

The next performance took place in the small bar KMart in Ostrava on Friday 10th May. This time I joined, with my cybernetic wireless MIDI controller, the noise and techno musician Jeronym Bartos aka Laboratorium Expiravit and the violinist and vocalist Samuela aka Soleom.

The situation was fairly similar: they sent me their audio signals and I modulated them with the movement of my body, this time in a quadraphonic setup. Due to the nature of their musical taste, we created an improvised dark-ambient soundscape.

Sonokinesis Performance at KMart, Ostrava, Czech Republic

At this stage I have been thinking about how different this approach is from what I have been used to until now. My body is becoming a musical instrument in the sphere of electronically produced live music. I have never been a dancer, yet during both performances I was acting and then also reacting to the sonic situations which emerged. From a choreographic point of view I have no idea how the performance could be perceived, as I feel I am at the stage of getting to know my new instrument and also of learning what my own body can do in a sonic context.

Sonokinesis Performance at KMart, Ostrava, Czech Republic

Work-in-progress IV: Prototype of the Central Station done and taking measurements for latex sensors

Last weekend I finished assembling the central station, which will collect data from all motion sensors in real time and transfer them via WiFi to the computer. I have attached it to an old crop-top of mine until a more elaborate piece is created later.

Prototype of the central station for sensors attached to the crop-top
Future idea for the central station piece (A.I. sketch)

Measurements of the limbs were taken and sent to the latex maker Ella, who is going to work on the piece this week.

A.I. sketch of the latex based motion sensors using conductive rubber
Example of motion sensors prototypes controlling effects

Inspiration for the new piece

The 3D printing process raised a few questions for me. I learned that the resin material is not recyclable. That made me feel quite uncomfortable, and I started to think about how I could use the waste from the resin support structure, which I found quite beautiful and interesting, and which eventually gave me some inspiration.

Resin waste from 3D printing

I got an idea for an audio-visual installation using primarily the resin waste. I would like to create a ‘plant cyborg’. The plant will consist of resin-waste crystals and natural materials arranged into a structure. The crystals will glow from the bottom with the use of LEDs, and speakers will be hidden in the sculpture.

The plant will work as a clock. Different crystals will express different time frames – hours, minutes, seconds – and sound will occur at specific time frames too.
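The clock logic could be sketched like this. A hedged sketch: the grouping into 24 hour crystals and 60 minute and second crystals, and the names, are my assumptions about one possible layout.

```cpp
#include <cassert>

// Sketch of the 'plant cyborg' clock: given seconds since midnight,
// decide which crystal in each assumed group should currently be lit.
struct CrystalState {
    int hour;    // index into 24 hour crystals
    int minute;  // index into 60 minute crystals
    int second;  // index into 60 second crystals
};

CrystalState clockToCrystals(int secondsSinceMidnight) {
    CrystalState s;
    s.hour   = (secondsSinceMidnight / 3600) % 24;
    s.minute = (secondsSinceMidnight / 60) % 60;
    s.second = secondsSinceMidnight % 60;
    return s;
}
```

Sounds could be triggered in the same loop whenever one of the indices changes, e.g. a distinct tone each time the hour crystal advances.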

The idea is to experiment with changing the perception of time. The cyborg artist Neil Harbisson, who was born with achromatic vision, had an antenna implanted into his skull in 2003, which allows him to hear colours as sine-wave notes. Over time he described these new sensations becoming perception. My assumption is that having such a clock around for long enough may eventually change the way we perceive time – in this particular case, a specific hour or time of day may become associated in our minds with a specific colour and/or sound. This could perhaps result in new ways of thinking and new inspirations.

Work-in-progress III: Preparation and 3D printing of the headpiece design

Jing Xu and Tsunagi sent me the ready final piece as an .stl file. I uploaded the final .stl file to PreForm to get it ready for 3D printing. Since I had never done 3D printing before, I encountered a few problems, like fitting the piece onto the right printer in virtual space and creating the supporting structure. All of these were quickly resolved with the help of the technician in the 3D Workshop at LCC.

PreForm for printing – created support structure.

Tuesday 23rd April 2024: I booked the 3D workshop last week for today at 12 pm. The 3D printing process takes about 11 hours and 35 minutes. Then I will need to remove the support structure with snips, sand the surface and UV-cure the piece.

Freshly 3D printed headpiece before removing the support structure
Sanding the headpiece
Final headpiece

The headpiece came out of the 3D printer exceeding our expectations. The measurements were made to fit Ona’s head; however, it comfortably fits the head of everyone who has tried to wear it. The next stage will be attaching the sensor. The original idea was to paint it white, but we agreed to keep it transparent.

Since the headpiece will remain transparent, I decided to attach programmable WS2812B LED strips to the inside, which will create light patterns based on accelerometer data from the sensor and work in parallel with the sound-control aspect.

Experimenting with WS2812B RGB and Arduino

Work-in-progress II

@xiaji2tsunagi2 and @abadteeth have been designing the headpiece together over the past few weeks. After presenting the first sketch (see the previous post), @ona.tzar and I had a few notes regarding the wearability of the design.

Original sketch – front part of the headpiece
Original sketch – back part of the headpiece

Both of us agreed that the front part looks very interesting as an idea; however, in reality wearing it could become uncomfortable or maybe even dangerous for the eyes. We proposed removing the endings which cover the eyes. Ona also pointed out that the V split at the back could be lower, in order to create space for her hair ponytail.

Improved sketch of the headpiece
Improved sketch of the headpiece placed on the head model

The next stage will be creating a 3D scan and taking measurements of Ona’s head. The 3D scan and head measurements will be sent back to @abadteeth and @xiaji2tsunagi2 so they can make an appropriately sized fitting in the software and prepare the project for 3D printing.

Work-in-progress I

At my end I have started to build the ‘Core Station’, which will process data from all sensors and transmit them via WiFi. It will be attached to the back of the performer. Based on my research, I decided to upgrade the microcontroller of the previous prototype from an ESP32 to a Teensy 4.1. ESP32s are still used, but only for the WiFi connection. The Teensy 4.1 contains an ARM Cortex-M7 microprocessor clocked at 600 MHz, which in comparison to the ESP32’s Tensilica Xtensa LX6 clocked at 240 MHz is a significant improvement, allowing fast, real-time data transfer from multiple sensors at the same time.

Teensy 4.1

The Teensy will gather data from two GY-521 (MPU6050) accelerometer sensors attached to the feet, two elbow and two knee stretch sensors, and a BNO055 (9-DOF absolute orientation sensor) which will be situated in the headpiece. Data from the sensors are sent via a UART connection to an ESP32-WROOM-32U access point. I considered an SPI connection, but I struggled to find appropriate libraries in the Arduino IDE and learned that it would require learning a different IDE. I have tested UART, which I am familiar with, and it proved sufficient; however, I am still considering sorting out an SPI connection in the future.

On the receiving end there is another ESP32-WROOM-32U, which is connected to the computer and sends the raw numerical data to Max MSP. The ESP32-WROOM-32U has a specific feature – the possibility of attaching an external WiFi antenna. This significantly improved the data transmission and range.

ESP32 WROOM-32U with the antenna.
Prototyping the Core Station on the breadboard – Teensy board, ESP32 Access Point device (Sender) and IMU sensors.
ESP32 Client device (Receiver)
Testing the speed of data transfer
Testing the range of the WiFi connection

Emerging team

I have been seeking for a while a fashion designer or maker, someone with an appreciation of a similar aesthetic to mine, so that the result could become a common effort rather than a commission.

After discovering conductive rubber I realised that it could be efficiently combined with latex. I approached a friend of a friend, the latex maker and designer Ella Powell @exlatex.

Ella Powell has been creating latex clothes and sheeting for the past two years. She studied a short course in latex making at Central Saint Martins over the summer of 2022. Currently she is studying for a master’s degree in computer science and AI.

After an initial meeting we drafted some ideas about creating latex-based, organic-looking, futuristic sensors which will efficiently collect data from the bending of the knees and elbows. Below you can see an AI-generated idea of the direction in which the piece might evolve.

Another artist joining the team is @Elixir31012. Elixir31012 is a multimedia artist duo formed in 2023 by the digital artist Jing Xu @abadteeth and the sound artist Jiyun Xia @xiaji2tsunagi2. Both graduated from the Royal College of Art with a degree in Fashion in 2023. Elixir31012 creates an otherworldly landscape of non-linear time through digital animation, experimental music making, wearable technology, and performance. Cyborg studies, myth, ritual, and feminist narratives are intertwined in their work. Elixir31012 takes its name from the Chinese Taoist vocabulary for “elixir”: 3 represents “Synthesis”, 10 the “Sacred”, and 12 the “Eternal Return”. Their sound art performance at the event Chronic Illness XX intrigued me greatly and we started talking. The idea to collaborate emerged very soon and organically, based on similar interests in creative technology, cyborg art and sound art. Elixir31012 proposed to make a headpiece which would carry the motion sensor for the Sonokinesis performance.

Elixir31012 performing at IKLECTIK Art Lab
Elixir31012 performing at IKLECTIK Art Lab

Declan Agrippa @hiyahelix, a second-year student of Sound Arts at University of the Arts London, London College of Communication, is going to create a sound design using the virtual wavetable synthesiser Serum.

Below you can see work-in-progress sketches of the sensor headpiece in ZBrush.


The multi-disciplinary and kinaesthetic artist Ona Tzar @ona.tzar is joining the team as a performer. Her creative input is very important for developing the whole system, because we would like the garments to be as ‘user friendly’ as possible. We have been actively discussing materials, sensor positions, and the shapes of the garments and the headpiece, trying to find the right balance between an ‘experimentalist aesthetic’ and keeping all pieces comfortable, functional and reliable for performance.

Sonokinesis: Part II – Drafting ideas for the collaboration

Last term I introduced the foundations of the project Sonokinesis – the idea of controlling sound with the movement and other aspects of the human body. I made a pair of wireless interactive Arduino-based gloves which allow the sound to be controlled in the visual programming language Max MSP and mapped into Ableton Live via Max for Live. The piece has been performed on two occasions so far, at Gallery 46 in Whitechapel and at Chronic Illness XXI, an underground event featuring performance art and experimental sound. Those performances revealed many flaws, and I started to troubleshoot and upgrade the project – mainly the unstable Bluetooth connection and the significant fragility of the assembled pieces.

The idea of Sonokinesis certainly doesn’t stop at a pair of Arduino gloves, and I aim to develop a more stable and durable version of the gloves, followed by other garments allowing the performer to encompass other parts of their body.

I have been experimenting with flexible pads for the knees and elbows and created a simple accelerometer-based headpiece triggering MIDI notes or samples. All of these are connected to a central mini-computer attached to the lower back with a belt. The central mini-computer is this time based on a different microcontroller, the ESP32-WROOM-32, and wireless connectivity is handled over WiFi, which proved more stable and faster than Bluetooth.

Assembling wearable station ESP32 Wroom-32
Headpiece carrying sensor MPU6050 (accelerometer and gyroscope)
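The headpiece’s note triggering could work roughly like the sketch below: fire once when the acceleration magnitude crosses a threshold, and re-arm only after it drops below again, so one head movement produces one note. The threshold value and structure are my assumptions for illustration.

```cpp
#include <cassert>
#include <cmath>

// Hedged sketch of accelerometer-based MIDI note triggering with
// simple hysteresis, as the MPU6050 headpiece might use.
struct Trigger {
    float threshold = 2.0f;  // assumed trigger threshold, in g
    bool armed = true;
    // Returns true exactly once per threshold crossing.
    bool update(float ax, float ay, float az) {
        float mag = std::sqrt(ax * ax + ay * ay + az * az);
        if (armed && mag > threshold) { armed = false; return true; }
        if (mag < threshold) armed = true;
        return false;
    }
};
```

On a `true` return, the firmware would send the MIDI note-on (or sample trigger) over the wireless link.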

For the knees and elbows I first assembled wearable pads based on the same flex sensors which I used for the fingers of the gloves. Unfortunately, they turned out to be highly inefficient in terms of durability. Their function was limited by fragility: the sensors started to break and rip even after a single use, which needs to be avoided at any cost, since the piece must remain stable during performance and be reusable. The cost of the flex sensors is also quite high considering their fragility (about £15 for one sensor).

Not long ago I discovered conductive rubber, which changes its electrical resistance based on stretch. I tested a strip cut from a sheet, attached to the knee pad, and it proved very efficient and durable, and in comparison to flex sensors also much cheaper.
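A stretch sensor like this is usually read through a voltage divider: the rubber strip sits between the supply and the analog pin, with a fixed resistor to ground. A hedged sketch of the conversion, where the 10-bit ADC and the divider wiring are my assumptions:

```cpp
#include <cassert>

// Sketch: convert a 10-bit ADC reading (0..1023) into the resistance
// of the stretch sensor, assuming the sensor sits between Vcc and the
// pin with a fixed resistor from the pin to ground.
// Divider: v = Rfixed / (Rsensor + Rfixed)  =>  Rsensor = Rfixed*(1-v)/v
float adcToResistance(int adc, float fixedOhms) {
    float v = adc / 1023.0f;   // fraction of Vcc seen at the pin
    return fixedOhms * (1.0f - v) / v;
}
```

As the rubber stretches, its resistance rises, the ADC reading falls, and the computed resistance can then be scaled to a MIDI control value.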

A strip cut from the sheet attached to the knee pad changing its electric resistance based on the stretch applied by the knee bending.