Acoustronic

About

Acoustronic is an award-winning inclusive ensemble based at Ulster University. Acoustronic was formed by Professor Frank Lyons as part of his research into Inclusive Creativity. They have performed internationally and contribute to research into inclusive/accessible music performance, composition, software and hardware design. Acoustronic were artists in residence at the Royal Irish Academy of Music, served as the model for three new ensembles around Ireland, and directly contributed to the formation of the Open Youth Orchestra of Ireland (OYOI).

More on Acoustronic can be found at InclusiveCreativity.com, but on this page I will show some of the work I have done with these amazing musicians.

String Quartet and Electronics

In 2016 Professor Lyons composed a work for the Benyounes String Quartet and Acoustronic called NonZeroSum. After the debut performance of this work, Professor Lyons held a workshop for composers, and I was one of the composers in attendance. Lyons’ work partially used graphic gestures displayed on monitors to elicit improvised performances from the Acoustronic performers and the string quartet. I developed the simple application Lyons used to do this and worked with him during the development of the composition.

I took this drawing and gesture idea to its next stage and created an application that let drawing output audio, while also allowing a composer such as myself to rapidly create accessible instruments that used drawing and resizable buttons as their interface. I used this application in my composition ‘Galvanised for String Quartet and Electronics’ when I attended Lyons’ workshop. Using this software I was able to create unique instruments for each performer and supply basic graphic scores to those performers. Galvanised was performed by Acoustronic and the Benyounes String Quartet alongside NonZeroSum at the VSMM 2017 conference at UCD, Dublin, and at the Calouste Gulbenkian Foundation in Lisbon.
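To give a flavour of the core mapping idea, here is a minimal sketch I have put together for this page using TypeScript and the Web Audio API. It is purely illustrative and not the actual code from the application, but it shows how a drawing gesture’s position can drive pitch and loudness directly.

```typescript
// Illustrative sketch only: map a drawing gesture's x/y position on a canvas
// to oscillator pitch and loudness with the Web Audio API (not the real app).
const ctx = new AudioContext(); // browsers require a user gesture before audio starts
const osc = ctx.createOscillator();
const gain = ctx.createGain();
osc.connect(gain).connect(ctx.destination);
gain.gain.value = 0;
osc.start();

function handleDraw(event: PointerEvent, canvas: HTMLCanvasElement): void {
  const rect = canvas.getBoundingClientRect();
  const x = (event.clientX - rect.left) / rect.width;       // 0..1 left to right
  const y = 1 - (event.clientY - rect.top) / rect.height;   // 0..1 bottom to top
  const freq = 110 * Math.pow(2, x * 4);                    // roughly four octaves above A2
  osc.frequency.setTargetAtTime(freq, ctx.currentTime, 0.02);
  gain.gain.setTargetAtTime(y * 0.5, ctx.currentTime, 0.02); // higher on screen = louder
}

const canvas = document.querySelector('canvas')!;
canvas.addEventListener('pointermove', (e) => handleDraw(e, canvas));
```

Resizable on-screen buttons then become, in effect, regions of the same surface that trigger fixed sounds rather than continuous control.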

The application was called Wappic (Web Application for Inclusive Creativity), and before I moved on from it I began to explore the Leap Motion controller as a possible device for accessible performance.

Derry Jazz Festival 2018

As a jazz musician with many years under my belt, and while studying for a Master’s degree at Ulster University, I was asked to perform at the Derry Jazz Festival in 2018. I was also working with Acoustronic at this time and decided that an original 30-minute composition for jazz quartet and electronics would be a more impactful way to approach the concert than the usual playing of standards (though I threw a few standards in to keep the purists happy).

The influence for this was watching the Benyounes String Quartet and one of the members of Acoustronic communicate musically and spontaneously. At a rehearsal the members of the quartet were warming up and playing some musical lines: scales, fiddly things and so on. One member of Acoustronic, who happened to be using the iOS app Thumb Jam, started to reply to these musical sentences, and there began a musical conversation of call and response between the two. As a jazz musician this sounded like heaven to me and inspired me to create a work for the ensemble that would entail call-and-response elements with the jazz quartet.

The 2D interface of an iPad used in the previous work for String Quartet and Electronics was limited in expression: there are really only two parameters to map, the x and y coordinates. For this composition I decided to develop an interface for the Leap Motion that would allow for greater expression.
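As a rough illustration of why the Leap Motion appealed to me, here is a small TypeScript sketch. The HandFrame fields are my own assumptions about what a hand tracker reports rather than the exact Leap Motion API, but they show how a single tracked hand can drive several synthesis parameters at once where a touch screen gives only two.

```typescript
// Illustrative sketch: one tracked hand drives several parameters at once,
// where a 2D touch surface only gives x and y. The HandFrame fields are
// assumptions about what the tracker reports, not the exact Leap Motion API.
interface HandFrame {
  palmX: number;        // left/right position, roughly -200..200 mm
  palmY: number;        // height above the controller, roughly 100..400 mm
  palmZ: number;        // forward/back position, roughly -200..200 mm
  grabStrength: number; // 0 (open hand) .. 1 (closed fist)
}

interface SynthParams {
  pitch: number;        // Hz
  volume: number;       // 0..1
  filterCutoff: number; // Hz
  reverbMix: number;    // 0..1
}

const clamp01 = (v: number): number => Math.min(1, Math.max(0, v));

function mapHandToSynth(hand: HandFrame): SynthParams {
  const x = clamp01((hand.palmX + 200) / 400);
  const y = clamp01((hand.palmY - 100) / 300);
  const z = clamp01((hand.palmZ + 200) / 400);
  return {
    pitch: 110 * Math.pow(2, x * 3),  // about three octaves left to right
    volume: y,                        // raise the hand to play louder
    filterCutoff: 200 + z * 4000,     // push forward to brighten the sound
    reverbMix: 1 - hand.grabStrength, // open hand = wetter sound
  };
}
```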

Absent Conductor

I was able to provide graphic scores and conduct Galvanised, but for this concert I was going to be one of the performers (I’m a saxophonist). The quartet consisted of myself on sax, Scott Flanigan on piano, Rohan Armstrong on bass and Darren Beckett on drums. Each of these musicians is a highly skilled jazz musician, very used to the looks and non-verbal gestures jazz musicians use to communicate on stage. Acoustronic, however, were very used to being conducted directly by hand gestures. I decided I’d build on the use of monitors from Prof. Lyons’ composition and create an absent conductor who would direct Acoustronic by signalling hand gestures on those monitors. The absent conductor sat at the side of the stage with my score and, when cued by the score, sent network messages to an application running on the performers’ computers. The performers saw animated hand gestures and traffic-light signals communicating how to perform and when to stop and start.
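The system itself was built in Max/MSP, so the sketch below is only an illustration of the message flow, written in TypeScript for Node.js; the cue format, port number and addresses are invented for this page. The idea is simply that the conductor’s machine sends small cue messages over the local network and each performer’s machine turns them into the animated gestures and traffic lights.

```typescript
import * as dgram from 'node:dgram';

// Illustrative sketch of the "absent conductor" message flow. The cue shape,
// JSON format and port number are invented for illustration; the real system
// was a Max/MSP patch.
type Cue =
  | { type: 'trafficLight'; colour: 'red' | 'amber' | 'green' }
  | { type: 'handGesture'; gesture: string }; // e.g. the id of an animation

const PORT = 9000;

// Conductor side: send a cue to every performer machine on the local network.
function sendCue(cue: Cue, performerHosts: string[]): void {
  const socket = dgram.createSocket('udp4');
  const payload = Buffer.from(JSON.stringify(cue));
  let pending = performerHosts.length;
  for (const host of performerHosts) {
    socket.send(payload, PORT, host, () => {
      if (--pending === 0) socket.close();
    });
  }
}

// Performer side: listen for cues and hand them to whatever draws the screen.
function listenForCues(onCue: (cue: Cue) => void): void {
  const socket = dgram.createSocket('udp4');
  socket.on('message', (msg) => onCue(JSON.parse(msg.toString()) as Cue));
  socket.bind(PORT);
}

// Example: at a point marked in the score, the conductor cues a green light.
sendCue({ type: 'trafficLight', colour: 'green' }, ['192.168.1.21', '192.168.1.22']);
```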

Here are some of our sessions exploring this idea as I composed. The composers amongst us always involve the performers in our composition processes, seeking their advice and collaboration.

John Lynch explores the Leap Motion application created in Max/MSP and later gives me advice on how to improve it.

Jay Hagon helped me explore a novel way to use the Leap Motion. We suspended the controller from a mic stand and, through some experimentation, discovered that the controller’s infrared cameras were blind to black PVC, so we covered a table with black PVC and used the table as a place for him to rest his arm and perform. Without it, the controller could not distinguish his hand from the table.

Here I get to dig in with Acoustronic and explore the opening of the composition. This was a lot of fun. Marie Anderson’s delicate and almost dance-like interpretations of the hand gestures are impressive.

And here is the ending of Coruscation for Jazz Quartet and Electronics, where the quartet and Acoustronic share improvisations. Coruscation, meaning a flash of wit, is an anagram of Acoustronic, and flashes of wit are common in our weekly rehearsals.

Saxophone and Electronics

Towards the end of my Master’s degree I wanted to explore iconography as a possible way to communicate between myself as a performer and Acoustronic. This was very similar to the Absent Conductor idea, but the iconography could contain more abstract communications. The saxophone contributes to the overall electronics when the Max/MSP app records samples and manipulates them in real time alongside the performances of the Acoustronic members.
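The sampling itself lives in the Max/MSP app, but the principle is easy to sketch. Here is an illustrative Web Audio version in TypeScript (not the actual patch): record a short clip of the live saxophone, then loop it back transposed so it sits alongside the live playing.

```typescript
// Illustrative sketch only (Web Audio, not the actual Max/MSP patch):
// capture a short sample from the microphone, then loop it back at a
// transposed speed so it plays alongside the live performance.
async function recordAndWarp(seconds: number, playbackRate: number): Promise<void> {
  const ctx = new AudioContext();
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

  // Record a short clip from the live input.
  const recorder = new MediaRecorder(stream);
  const chunks: Blob[] = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.start();
  await new Promise((resolve) => setTimeout(resolve, seconds * 1000));
  recorder.stop();
  await new Promise<void>((resolve) => { recorder.onstop = () => resolve(); });

  // Decode the clip and loop it back, transposed by changing playback rate.
  const buffer = await ctx.decodeAudioData(await new Blob(chunks).arrayBuffer());
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.playbackRate.value = playbackRate; // e.g. 0.5 = an octave down
  source.loop = true;
  source.connect(ctx.destination);
  source.start();
}

// Example: grab four seconds of saxophone and loop it back an octave lower.
recordAndWarp(4, 0.5);
```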

We performed this at a symposium for Inclusive Music Making in Dublin with the Royal Irish Academy of Music, to announce our research collaboration on the Le Cheile Project. Here are two of the Acoustronic members and myself working on the composition.

Le Cheile and Open Youth Orchestra of Ireland

While I was finishing my Master’s degree and preparing my PhD proposal, Professor Lyons asked me to be the technical lead on a joint research project between Ulster University and the Royal Irish Academy of Music. The project was called Le Cheile (‘Together’ in Irish), and we were going to create three new ensembles around Ireland based on Acoustronic. We would also bring those ensembles together to form the Open Youth Orchestra of Ireland.

As you can tell, technology is very important to us and serves as a fantastic route to breaking down barriers to access for musicians with disabilities, and of course it’s my bag to develop that software in creative new ways as a composer with programming skills. So I leapt at the opportunity to be a part of something that was groundbreaking here in Ireland.

My remit was not only to decide where we spent money on technology but also to demonstrate to our academic partners around Ireland how we might use that technology in an inclusive and creative way. My PhD had begun by the time we started this project (I had no break between Master’s and PhD) and, of course, my work now looks at the use of VR for inclusive music making. So I suggested that we explore a VR composition as part of our debut performance. The intention was to do something big and demonstrate to our partners and others just what is possible with new technology, hard work and collaborative/participatory practices.

I can’t go into the development of the VR composition too much because it is still part of my PhD research. In fact, the performance system was the prototype for WithFeelVR. But Trip Tick for VR and Electronics was performed at the debut performance of the OYOI at Athlone Institute of Technology in September 2019. Here’s a sneak peek at one of our rehearsals.