Moth Glitch (2024)

An homage to Stan Brakhage, using AI-generated noise and moth wings made of data.

Stan Brakhage’s 1963 book, Metaphors on Vision, is worth revisiting in an era when metaphors for vision abound in the way we discuss generative AI. Brakhage writes,

Imagine an eye unruled by man-made laws of perspective, an eye unprejudiced by compositional logic, an eye which does not respond to the name of everything but which must know each object encountered in life through an adventure of perception. How many colors are there in a field of grass to the crawling baby, unaware of "green"? ... Imagine a world alive with incomprehensible objects and shimmering with an endless variety of movement and innumerable gradations of color. Imagine a world before the "beginning was the word."

Is it appropriate to compare this to generative AI? Machine vision, maybe. But generative AI is not responding to a “world before the beginning was the word,” because the “word” is an essential variable infused into the calculus of the training process. Generative AI is the opposite of “an eye unruled.” AI models, used as intended, don’t move us away from the bias of human vision; they constrain us to it. This bias is infused into the training data, a bias that merges images into the categories of their descriptions, reconstructing links between words and what they represent.

Artists might try to break away from that constraint. Brakhage’s Mothlight is an example. Stan’s moths were real moths. The film was said to be “created by painstakingly collaging bits and pieces of organic matter—moth wings, most notably, as well as flowers, seeds, leaves, and blades of grass—and sandwiching them between two layers of clear 16-mm Mylar editing tape.” What was projected was light cast through the objects themselves, rather than through images of those objects captured on film.

Brakhage looked at the mechanisms of cinema and asked how we might use the same tools differently. In using AI, we are expected to rely on the logic of images and associations in the training data. A computer is unaware of green, but the word green, the shred of text that results when we place a g beside an r beside two e’s and an n, becomes a reference to a limited, previously selected range of noise-infused photographs of grass patches. It’s an image of the image of the world, an illusion representing an illusion.

The first computer "bug" (though the use of "bug" to describe a glitch predates it). 

The software glitch presents a way of hijacking that illusion to make something apart from it. The first bug found inside a computer system was a moth that flew into a Mark II at Harvard in 1947. The team using the computer (which included Grace Hopper) taped it into their logbook. The use of “bug” to describe a glitch had been around since Edison’s telephone wires, but the Mark II moth cemented (or taped) its persistence into the computer age.

When I think of Mothlight’s moths, stuck between the lamp of the projector and the light of the screen, I remember the moth stuck in Relay #70 Panel F, a glitch taped to a notebook. Moth Glitch is a new experiment with AI-generated things: moths and music and noise and glitches.

Moths Glitch

Noise is at the heart of generative AI. It is what creates the illusion of creativity in these images and videos. Paradoxically, AI systems can’t produce true noise, because noise is too complex. These are systems built on compression, and compression introduces bugs: gaps in information within the system that lead to a failure state.

I’m interested in pushing the boundaries between compression and legibility. In Brakhage’s Mothlight, the real wings stand in for the imprint of light that would otherwise capture their shape on film. The moth is a glitch, a computer bug, a thing that disrupts the tidy compression of the world into a stream of bits, or interrupts the stream of images that produces the hazy, hallucinatory effect of cinema. But noise is always one decision away from being considered a signal: a change in perspective, a change in priorities, and the problems of a medium become a use.

Too much complexity overwhelms any system’s capacity to compress it. For an AI system, this leads to glitches too. Diffusion models are designed to remove noise, to find shapes in an image of noise. When I prompt a model to create noise, I bypass the internal systems that seek to recognize images. I steer the system away from representation.

Yes, when I prompt noise I get an image of noise. However, the resulting noise is not what’s in the training data. It’s noise produced by an actual, technical failure in the model: a prompt for noise is like a moth flying around inside a digital infrastructure, rearranging patterns of light into colorful, swirling smears that push the boundaries of compression, creating moiré patterns and image breakdowns and new things for us to see. Things without references in the data.

Moths pollinate in darkness, reaching even more of the world’s plants than bees do. We don’t see them, flitting about behind the interface of a world gone into dark mode. We know them as suicidal, but their attraction to flame and porch lights is misunderstood. Willow Defebaugh writes in Atmos:

Rather than being drawn to the light, moths become “trapped” by it. In a series of experiments, scientists observed moths flying around artificial light sources with their backs to them, even upside down, in endless loops. Rather than flying toward the light, they deduced that moths always fly with their backs to the brightest source—an evolutionary gift that lets them tell up from down while traversing the dark.

In other words, the moth is trapped in a glitch: the desire for the brightest light to be at its back drives it into circling that light in a spiral, always arching its flight path against the overwhelming glow.

The Noise Glitch

The glitch part of Moth Glitch is the swirling bursts of shifting texture and color behind the moths. These are accidents of the system, produced by asking it to generate the very thing it is meant to filter out. In the foreground are the moths themselves. There is something whimsical about watching what passes for moth flight within an AI video generator. Yes, it’s bad at it, but I find it a charming insight into the system’s limits, at odds with the imagination of AI as a superhuman physics engine modeling our world.

I’m working with traces of real moths, recalculated from the dataset into a renewed simulation of animation. Brakhage was working with moths plucked from his window.

I think about Mothlight quite often when I work with AI. It raises an absurd question: how do you get at the materiality of AI through its output? Here, I’m at risk of imagining materiality too literally. That would mean data centers and fiber optics. Materiality is not the right question at all! But I want to see what the wrong question gets me.

There is constant slippage between what is represented and what represents it. The digital requires an immersive imagination to operate: we need to believe that we are submerged, that images represent what is there, even if it is half-formed, like a daydream of a moth. Whatever social media has eroded about images, AI has pushed into the sea. What is there can only be indirectly referenced, as traces in the outcomes. The trace left in the generated image is a residue of its collective references. Lots of moths make the moth.

Stan Brakhage described making the film as an effort at no-camera cinema, a way of assembling images onto film and projecting them without their ever passing through the apparatus of the camera.

“I tenderly picked them out and start pasting them onto a strip of film, to try to ... give them life again, to animate them again, to try to put them into some sort of life through the motion picture machine,” he says in a director’s commentary on the Criterion DVD.

Moth Glitch is perhaps the opposite of this: animating the data of no moths in particular.