
AI-themed projects...

Some projects involving AI...

Multi-Layered, AI-Augmented, Analogue-Interactable Dance Project

The students gained an arts award from participating

Some of the dance was turned into animation that then went back into the dance :)

Re-dancing the dance. Using movement-based controllers to reinterpret the original means the digital creative process also feels performance-like.

Also a kind of dance group at work

This project with Darcy Kitchener, Fermynwoods Contemporary Arts, C2C Artists in Residence and Grange Academy culminated in a tech-augmented dance performance that combined live dance, AI and digital interpretations of the dance, brought together by unique methods of human/computer interaction. All of which earned the students of Grange Academy an arts qualification! :)

(A full video of the main performance isn't shown here because some of the participants were from vulnerable groups)

Students from Grange Academy began by creating a dance routine under the guidance of Darcy Kitchener. This first-level performance was filmed and used as source material for the next layer of digital creation. The students interacted with AI-based processes using a selection of custom, kinetic-feeling control devices, ranging from a relatively conventional drawing-with-a-pencil feel, to a device that controlled digital time with a spin of the hand, to a kind of computerised Theremin. This allowed the creation of highly processed and unexpected outcomes, but with a physically intuitive experience and a live, instantaneous, performance-like quality to the process. The result was a high-res movie and a low-res pixelated animation. The pixel animation was placed onto custom-made hand-held light tiles and used as props by the dancers to create a second iteration of live dance. The final piece was a live performance with the dancers using the animated props with the movie projected behind, creating a deeply nested experience involving many layers of creation and interpretation.
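
For a flavour of the pixel-animation step - a hedged sketch rather than the project's actual code, with the tile resolution and frame filenames assumed - squashing video frames down to a light-tile-sized grid can be as simple as this:

```python
# A minimal sketch (not the project's actual code): reduce exported video
# frames to a tiny pixel grid of the kind that could drive a hand-held
# light tile. Tile size, filenames and colour handling are assumptions.
from pathlib import Path

from PIL import Image

TILE_W, TILE_H = 8, 8  # assumed resolution of one light tile


def frame_to_tile_pixels(frame_path: Path) -> list[tuple[int, int, int]]:
    """Reduce one video frame to a TILE_W x TILE_H grid of RGB values."""
    img = Image.open(frame_path).convert("RGB")
    tiny = img.resize((TILE_W, TILE_H))  # interpolates down to tile resolution
    return list(tiny.getdata())  # row-major list of (r, g, b) tuples


if __name__ == "__main__":
    # Frames assumed to have been exported beforehand, e.g. with ffmpeg.
    for frame in sorted(Path("frames").glob("*.png")):
        pixels = frame_to_tile_pixels(frame)
        print(frame.name, pixels[:4], "...")  # here they'd be sent to the tile hardware
```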

One interesting discovery during the building of the tech was that the simplest and most effective way to run the AI code was inside a web browser. The controller devices were then made browser-compatible by tricking the computer into thinking they were musical instruments :) This opens up the possibility of a version of the project where people could participate in the digital creation process from anywhere via the web...
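
The actual piece read the devices in the browser (their musical-instrument disguise is exactly what made that possible), but the same idea can be sketched in Python with the mido library - the port name and what each controller number means below are assumptions:

```python
# Sketch of the "controllers pretend to be musical instruments" idea,
# using the mido library. The real piece read them in a web browser;
# the port name and controller mapping below are assumptions.
import mido


def run() -> None:
    print("Available MIDI inputs:", mido.get_input_names())
    # Assume the spin-the-hand / theremin-like device shows up as "CustomController".
    with mido.open_input("CustomController") as port:
        for msg in port:
            if msg.type == "control_change":
                # e.g. CC 1 might be the spin that scrubs digital time,
                # rescaled from MIDI's 0-127 range to 0.0-1.0.
                value = msg.value / 127.0
                print(f"controller {msg.control} -> {value:.2f}")


if __name__ == "__main__":
    run()
```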

Toggler - AI-powered webpage takeover


A response to Fermynwoods' commission for a 'takeover' of their website. The piece consists of code inserted into their website which could be optionally activated by website visitors. The artist then has hacker-like free rein to alter / enhance / disrupt the organisation's carefully crafted public presence!

To create this piece an AI process watched people moving and gathered data on their motion. This small amount of imperfect data becomes the animated stick figures - a view of what the AI saw when it looked at people. An opportunity to ask both: what does the machine see when it looks at you? And what do you see when you look at the machine? Do you correctly understand the relationship you have entered into? The figures are aware of the structures of the web page and like to dance on the elements that can support them. In that sense, they really do live inside the digital entity of the web page.
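
As a rough illustration of how little data is needed (this is made-up example data, not the piece's code - the keypoint names, skeleton and noise level are assumptions), a handful of noisy pose keypoints is already enough to draw a recognisable stick figure:

```python
# A handful of noisy pose keypoints - the small, imperfect traces described
# above - joined into a stick figure. All names and numbers are illustrative.
import random

# One frame of captured motion: keypoint name -> (x, y) in page coordinates.
KEYPOINTS = {
    "head": (100, 40), "neck": (100, 60), "hip": (100, 110),
    "l_hand": (70, 90), "r_hand": (130, 90),
    "l_foot": (85, 160), "r_foot": (115, 160),
}

# Which keypoints get joined by a line to form the figure.
SKELETON = [("head", "neck"), ("neck", "hip"),
            ("neck", "l_hand"), ("neck", "r_hand"),
            ("hip", "l_foot"), ("hip", "r_foot")]


def noisy(pt: tuple[float, float], jitter: float = 4.0) -> tuple[float, float]:
    """The data is deliberately imperfect - every point carries a little noise."""
    return (pt[0] + random.uniform(-jitter, jitter),
            pt[1] + random.uniform(-jitter, jitter))


def stick_figure_segments(keypoints: dict) -> list:
    """Turn one frame of keypoints into drawable line segments."""
    pts = {name: noisy(p) for name, p in keypoints.items()}
    return [(pts[a], pts[b]) for a, b in SKELETON]


if __name__ == "__main__":
    for segment in stick_figure_segments(KEYPOINTS):
        print(segment)  # in the piece these lines are drawn over the host web page
```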

"It’s easy to feel while you are looking at a web page that it’s a solitary experience, but you are in fact in a public space. Details of your actions and in-actions can be and are observed, and while you look at the screen, the machine looks back at you.

Our intuition for the most part hasn’t caught up with the idea of data. The characters in the animation are not people, but they have come into contact with people. They are noisy and incomplete traces of data that have been gathered by a simple AI process, but something about our humanity is easily strong enough to show through and communicate a surprising amount to anyone that chooses to look. For me, I think that’s the most remarkable part – what a powerful mark we leave on things. What does your data self look like on the other side of the screen?"

(almost) instant AI video and some experiments in separating speaking


This video was made with students excluded from mainstream school and was completed in an afternoon. It uses essentially the same AI tech as the Toggler example above and was efficient enough to run on a Raspberry Pi :) An interesting twist here is in the audio, which contains the result of an experiment in using AI to separate the expressive and 'purely verbal' elements in spoken language. Spoken 'lyrics' were interpreted by AI into text, removing any expressive or 'musical' audio elements, and then by another AI process back again into speech. It's fair to say that there's not much left unchanged! In my mind this video is called 'Loitab loitab loitab'. (The other part of the music is Daft Punk)...
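
The models used in the project aren't named above, so as a stand-in sketch here is the same round trip with openai-whisper for speech-to-text and pyttsx3 for text-to-speech - everything expressive that the transcription can't hold gets discarded in the middle step:

```python
# A minimal sketch of the speech -> text -> speech round trip used to strip
# the expressive, 'musical' side of a voice and keep only the words.
# whisper and pyttsx3 are stand-ins; the project's actual models aren't stated.
import pyttsx3
import whisper


def strip_expression(in_wav: str, out_wav: str) -> str:
    # Step 1: speech to text. Timing, tone and melody are lost here -
    # only the 'purely verbal' content survives.
    model = whisper.load_model("base")
    text = model.transcribe(in_wav)["text"]

    # Step 2: text back to speech, in a synthetic voice with none of
    # the original delivery.
    engine = pyttsx3.init()
    engine.save_to_file(text, out_wav)
    engine.runAndWait()
    return text


if __name__ == "__main__":
    words = strip_expression("spoken_lyrics.wav", "re_spoken_lyrics.wav")
    print("What survived the round trip:", words)
```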

Clams, crabs, flies and a DIY cybernetic snail eye 

[Images: 1. the snail's eye DIY feature in Hackspace magazine; 2. 'They're all just a change in entropy to me' - the cybernetic crab eye's seeing process]

Above: 1. The snail's eye was published in Hackspace magazine. 2. It's possible to watch the thought / seeing process of the cybernetic crab eye in progress. The three images represent a progression that starts with more literal sensory data and moves towards greater abstraction and greater certainty. It's probably true to call the most abstract end the 'seeing', which in this case is not best described by the pixels, but by a feeling of confidence that the thing you are looking for is present.
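
As a rough sketch of that progression (not the crab eye's real code - frame sizes and the threshold are assumptions), two camera frames can be boiled down from literal pixels, to a motion map, to a single confidence that the thing you're looking for is present:

```python
# Rough sketch of the "pixels -> abstraction -> confidence" progression,
# using plain numpy. Frame shapes and the threshold are assumptions;
# this is not the project's actual crab-eye code.
import numpy as np


def seeing_stages(prev_frame: np.ndarray, frame: np.ndarray, threshold: float = 20.0):
    """Return the three stages: raw difference, motion map, confidence."""
    # Stage 1: the most literal data - how much each pixel changed.
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    # Stage 2: more abstract - a map of where something moved.
    motion = diff > threshold
    # Stage 3: the 'seeing' itself - one number for how confident we are
    # that movement (the thing being looked for) is present at all.
    confidence = float(motion.mean())
    return diff, motion, confidence


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.integers(0, 255, (120, 160), dtype=np.uint8)  # stand-in grayscale frames
    b = a.copy()
    b[40:80, 60:100] = rng.integers(0, 255, (40, 40), dtype=np.uint8)  # a 'moving' patch
    _, _, conf = seeing_stages(a, b)
    print(f"confidence that something moved: {conf:.2f}")
```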

This group of projects is a little too large to cover in much depth here, but it represents quite a fundamental thread that runs through much of the ongoing work. A collection of projects has resulted in cybernetic 'eyes' based on organic equivalents. The first was a clam's eye - developed somewhat by accident as a solution to a data gathering problem! Many types of eye are very closely related to each other, and so following from the clam were developed vision systems related to a fly's eye, a crab's eye and a snail's eye. The fly eye was also built to provide a solution to a tricky data gathering problem. The crab eye was created to see hand movements and use them as a way to control video. The snail's eye was created as a way to try and understand the most fundamental aspects of vision and as a part of the workshop Little Robot Heartbeats. The device allows you to perceive and explore in snail-like terms (link here when I get around to it!). It was also published as a DIY project in Hackspace.

A snail's eye is extremely simple, and perhaps the most interesting thing is that 'seeing' at this fundamental level is not about image as we would tend to understand it. Rather, it would be fairer to describe it as feeling at a distance. Moving up through the complexity, towards the world we are more familiar with, another insight from these experiments is that vision appears to be driven first by the question 'what are you trying to see?' There are a great many ways to see and it seems, for instance, that toads may be unable to perceive vertical lines! Further investigation is ongoing... :)
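
And to give a feel for how simple 'feeling at a distance' can be (a sketch only - the sensor function here is a stub standing in for the real photo-sensor circuit, and the thresholds are assumptions), a snail-level eye is little more than one light reading tracked over time:

```python
# A snail-level 'eye': one light value, a slowly tracked baseline, and a
# reaction when the light suddenly drops (something looming nearby).
# read_light() is a stub - on the real device it reads a single photo-sensor.
import random
import time


def read_light() -> float:
    """Stand-in for the single light sensor (0.0 = dark, 1.0 = bright)."""
    return random.uniform(0.4, 0.6)


def snail_eye(steps: int = 100, darken_threshold: float = 0.15, smoothing: float = 0.9) -> None:
    baseline = read_light()
    for _ in range(steps):
        level = read_light()
        # Slowly track the ambient light level...
        baseline = smoothing * baseline + (1 - smoothing) * level
        # ...and 'feel at a distance' when it suddenly drops below that baseline.
        if baseline - level > darken_threshold:
            print("something looms - withdraw!")
        time.sleep(0.1)


if __name__ == "__main__":
    snail_eye()
```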

The Sound Of The Space You Displace

"Loved this piece. The little robot that's attracted to faces was my best friend!"
 
[Images: The Sound Of The Space You Displace installed at MK Gallery, the Airflow Machine, and other details of the piece]

This interactive, sound-based piece was a prize-winning installation at Milton Keynes Gallery as part of the exhibition MK Calling. It explores some of the subtle, unconscious and involuntary effects we and the world have on each other. Inputs including very small displacements of air and subtle vibrations - the kind you can't help but make simply by being there - are expressed as part of the composition. In a way, making audible the music we are constantly surrounded by but not usually aware of.

An AI-based component is a little robot 'creature' that's very interested in faces. It's very difficult not to perceive it in social terms, and data from your interaction feeds into the live sound composition. In this way even something about the social space becomes part of the sound :)
Above: developing the robot creature.
There's more on this piece here...
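
As an illustration of how that kind of involuntary input can feed the composition (this isn't the installation's code - OpenCV's stock face detector stands in for the robot's own AI, and the 'sound parameters' are invented for the example):

```python
# Not the installation's code: a sketch of how involuntary input (here, the
# faces the little robot's camera can see) is reduced to numbers a live
# composition could use. OpenCV's stock face detector stands in for the
# piece's own AI; the parameter names are invented for the example.
import cv2

face_finder = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def social_parameters(frame) -> dict:
    """Reduce one camera frame to a couple of values the composition can use."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_finder.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # How close the nearest face is, roughly: widest face relative to frame width.
    closeness = max((w for (_, _, w, _) in faces), default=0) / frame.shape[1]
    return {"people_present": len(faces), "closeness": closeness}


if __name__ == "__main__":
    camera = cv2.VideoCapture(0)
    ok, frame = camera.read()
    if ok:
        print(social_parameters(frame))  # these values would modulate the live sound
    camera.release()
```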

Some human/machine creative collaboration games

[Images: a human/machine collaboration drawing and the pen-drawing machine]

My work has featured a few projects about human/machine collaboration. This one is here because it uses a type of AI, was a lot of fun, makes some interesting points rather simply, and it's possible to put some of it in a web page so you can play with it :)

In this little creative game, a human participant makes a simple sketch on screen. Simple and sketchy are pretty much unavoidable, since a mouse is not a very subtle drawing tool! The machine participant then makes a sketch of the first sketch. This time 'sketchy' does not come so simply - but it is an essential ingredient. For a successful process the machine must be error prone, or noisy, in the right way. It's not an easy thing to define or create algorithmically - the right kind of mistakes! Once the machine has made its sketch that interprets the human sketch, it draws it with a pen on paper. Finally, this drawing goes back to the human (or another human) to be interpreted and altered one more time :)
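
A hedged sketch of 'the right kind of mistakes' (the stroke and the noise settings are made up, and this isn't the piece's actual algorithm): wobble that drifts smoothly along the stroke reads as hand-drawn, where independent jitter would just look broken:

```python
# Sketch of 'the right kind of mistakes': smooth, correlated wobble along
# the stroke rather than independent per-point jitter. Example data and
# noise settings are invented; the piece's real algorithm isn't shown here.
import numpy as np


def machine_resketch(stroke: np.ndarray, wobble: float = 6.0,
                     smoothness: int = 15, seed=None) -> np.ndarray:
    """Take an (N, 2) array of stroke points and return a 'sketchy' re-drawing."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, wobble, size=stroke.shape)
    kernel = np.ones(smoothness) / smoothness
    # Smooth the noise along the stroke so the errors drift rather than jitter.
    for axis in (0, 1):
        noise[:, axis] = np.convolve(noise[:, axis], kernel, mode="same")
    return stroke + noise


if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 200)
    circle = np.column_stack([100 + 50 * np.cos(t), 100 + 50 * np.sin(t)])  # the human 'sketch'
    print(machine_resketch(circle, seed=1)[:5])  # points that would go to the pen plotter
```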

The final outcome is about those unexpected, happy surprises and opportunities that are central to creative processes. It's also about the way that interaction with a machine can be a friendly and playful experience - not by programming or contrivance but genuinely, by its nature. And it's also about the observation that 'biological noise' may be a little hard to find or define, but once you have it, it's suspiciously familiar!

You can play with a version here. (Can't do the bit about machine drawing with a pen, obvs!)

 