Last edited 7th July 2024
Live programming is uniquely suited to creative work. It can remove many of the creative blockers that people experience while making things. But we could place much more explicit emphasis on removing emotional blockers from the creative process, rather than focusing only on intellectual blockers. Arroost is a project that seeks to do that — an experimental live programming tool for making music.
For me, creative work is when you use your imagination to "make something". This could be performing a song, writing a book, painting a picture, drawing a diagram, baking a cake, giving a speech, designing a game, or coding a program.
In any creative work, the person doing it might experience blockers that stop them from continuing. Let's think about what those blockers could be. Here are some examples.
I'm sure you could think of some more examples of your own.
We could sort all of these potential blockers into one of two categories:
Intellectual blockers are when you don't know how to solve a problem, or create what you intend, even if you know what you want. You might not know how to use a tool, or you may not have the right materials. Your tool might be insufficient! Your paint bucket was too small, or your coding tool was too limited. Or your programming environment doesn't give you enough realtime feedback, so you have no chance of understanding how it works.
Emotional blockers are when you have the tools and materials and skills that you need, but you're feeling too nervous, or too scared, or too embarrassed, or too distracted.
Let's organise our example blockers into a table.
| Intellectual blockers | Emotional blockers |
|---|---|
| "Oh no! I ran out of paint." | "I feel too nervous to go up on stage." |
| "I don't know how to use this programming language." | "I suck at drawing. It'll look stupid if I try." |
| "I can't visualise what I want to make." | "There's too much pressure on me to be good." |
| "It's not producing the thing I want and I don't know why." | "I've run out of ideas." |
Live programming has often been paired with creative work, the current canonical examples being Bret Victor's Inventing on Principle and Stop Drawing Dead Fish talks.
And Ink & Switch continues work in that vein, with live programming projects like Untangle being used to explore different possibilities for generative art in a playground-like setting.
It goes without saying that live coding itself combines live programming with creative work. And Alex McLean's Algorave continues to foster an international community of live coders, using tools like Strudel.
I can guess why the two have been paired up — they're a great fit for each other. Live programming's immediate feedback encourages exploration and play. This allows the coder to discover new possibilities, and to feel their way through the tool. And the emphasis on tangible representation of code allows the coder to directly manipulate their work, and visualise what is going on. They can see it.
From these benefits, I can see how live programming removes many creative blockers.
Personally, I feel that there has been a lot of attention on removing intellectual blockers, and there is still far more to explore in terms of removing emotional blockers. I think there is a lot of untapped potential there (which I will now try to convince you of over the course of this essay).
We work alongside/within the tools for thought space, where the focus is on supporting a person's thoughts, enabling them to use the computer as an extension of their brain. And this ties into the ever-cited bicycle for the mind metaphor. The computer is a tool that magnifies our intellectual capabilities, just as a bicycle magnifies our physical capabilities.
In my opinion, these metaphors reveal a bias towards tools that remove intellectual blockers. They acknowledge that our brain has limits, and they seek to support it.
If that is true, we could also conceive of a set of tools that help us to overcome emotional blockers. We should acknowledge that our emotions have limits, and we can use computers to help support them. Perhaps we could call it... tools for feeling... no that's too cheesy... how about... bicycle for the heart... no I don't like that either. Let's move on.
It turns out, there are many tools out there that try to remove emotional blockers. We can learn from them when making our live programming tools.
tldraw is a whiteboard tool and library. It's interesting because it's all very wobbly. Whenever you draw, or write, or make shapes, all of your marks are imperfect. Everything looks slightly wonky and scrappy.
This wonkiness is intentional. tldraw makes it difficult to line up your work perfectly. It's so hard that it stops you trying, and you carry on drawing and writing instead.
And tldraw presents you with a very limited number of colours to pick from, twelve in total. You don't waste time obsessing over finding the perfect colour because you can't. You just pick one and draw.
Sandspiel is a video game that simulates different elements on a pixel-by-pixel basis. Elements like sand, water, and fire all behave differently and interact in combinatorial ways. This kind of game is known as a falling sand game.
The unusual thing about Sandspiel is that most users don't use it as a falling sand game. They use it as a drawing tool instead. Why would they do that?
Why would anyone draw in a falling sand game and not an actual drawing tool like Photoshop or Procreate or Pixlr, or even Microsoft Paint, or even just on a physical piece of paper? Sandspiel wasn't (originally) designed for drawing, so it seems like an odd choice.
But the occasional commenter explains why: They've already tried those "real" drawing tools, and it goes badly. They can't handle the pressure of the blank canvas, and the expectation to make something good. But in Sandspiel, they feel free to draw to their heart's content. And they feel free to share their work with the world.
And with Sandspiel, there's some warmup time. When you enter Sandspiel, you have something to play with. You can place down elements, and interact with the game while you're deciding what to draw. You might get inspiration from the game itself.
Sandspiel Studio is a live programming tool that I worked on with Max Bittker, the creator of Sandspiel. It's an end-user-programmable version of Sandspiel. Users can use its block-based code editor to change how the elements behave and look.
We tried to lean into the same strengths that Sandspiel has. It doesn't start you off with a blank canvas. You start with a palette of existing elements, and you can begin by editing them, and playing around. You're not forced to start from scratch (like in Scratch), which is much more daunting. The goal of this is to stop the users feeling afraid. The goal is to remove emotional blockers. They can take their time to get familiar with the system, before jumping in.
Algorave is a practice where someone live codes visuals and music, and a crowd dances in response.
This puts the live coder in a vulnerable position because there is a certain amount of pressure involved. If they make a mistake, it could affect the whole crowd.
However, the fact that it's being done live as a performance means that the expectations are adjusted. People can meaningfully expect that there will be some mistakes, and that's actually part of the fun. There is a culture of embracing the mistake within live coding and Algorave, and that removes emotional blockers around failure.
Toshio Iwai is an artist who has developed various live programming tools for making music. These include Sound Fantasy and the more widely known Electroplankton.
Both tools provide the user with puzzle-like musical instruments. But using them feels less like playing an instrument, and more like playing a game — interacting with an environment. As the music-maker, it feels like you take a secondary role to the tool itself. You cannot control it in its entirety, you can only influence it to an extent, to try to get the simulation to make different kinds of sounds. It is the tool making those sounds, not you, and this takes some responsibility away from you. There is no pressure on the user. There are no emotional blockers.
Maywa Denki is an art collective disguised as an electric company. They produce musical instruments known as nonsense machines, some of which are programmable.
These nonsense machines are interesting because, despite being toy instruments, they are not designed to be easy to use. Instruments like the Otamatone are intentionally hard to play, and can sound terrible, even grating, in the hands of a beginner. The effect of this is that there is no pressure on the user to perform or create something good. The expectation is that it will sound bad, so there are no emotional blockers.
Is it possible to try to combine the strengths of these tools into a single live programming environment? What would that look like — sound like? Would it be good? Would it be terrible?
To try this out, I made a live programming tool called Arroost. It's pretty buggy.
When you first open Arroost, you're presented with a message.
When you click, a single shape slides onto screen. As an introduction, this first moment is important. It shouldn't feel like you 'created' this shape. It should feel like a surprise. And it shouldn't slide to where your cursor is — that would be too helpful.
This should tell you that you're not in control. Arroost is in charge.
At this point, you are able to drag the shape and the world around. It's a chance to get a feel for the physics of the tool. Every object, including the camera, has inertia. They should feel like physical objects. It should feel like a simulation.
You can click on the shape to pull out a line.
Clicking again on the canvas creates another shape.
This second shape is a Recording shape. Click it to start recording, and click again to stop.
After recording a sound, you can click the shape to play it back. And the sound can be adjusted by moving the shape around. If you drag it higher, the pitch gets higher. If you drag it lower, the pitch gets lower.
You can also adjust the horizontal line on the shape to change the starting offset of the sound.
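To make that mapping a little more concrete, here is a simplified sketch of how a recording shape's position could drive playback, using the standard Web Audio API. This is not Arroost's actual code: the RecordingShape fields, the baseline, and the pixels-per-octave mapping are made up for the example.

```ts
// A minimal sketch (not Arroost's real code) of mapping a recording shape's
// vertical position to pitch, and its horizontal line to a start offset.

const audioContext = new AudioContext()

// Hypothetical shape state: y is the vertical position in canvas space,
// offsetFraction is where the horizontal line sits (0 = start, 1 = end).
interface RecordingShape {
  buffer: AudioBuffer
  y: number
  offsetFraction: number
}

function playRecording(shape: RecordingShape, baselineY = 300): void {
  const source = audioContext.createBufferSource()
  source.buffer = shape.buffer

  // Dragging the shape above the baseline raises the pitch, below lowers it.
  // Here, every 100 pixels is one octave (an arbitrary choice for the sketch).
  const octaves = (baselineY - shape.y) / 100
  source.playbackRate.value = 2 ** octaves

  // The horizontal line sets how far into the recording playback begins.
  const startOffset = shape.offsetFraction * shape.buffer.duration

  source.connect(audioContext.destination)
  source.start(audioContext.currentTime, startOffset)
}
```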
You can easily spend a lot of time just playing with the shapes and sounds. I would like to point out that this all happens before you start doing any live programming. I think that it is important to let the user play with a creative tool before they start creating anything substantial. This is similar to what we saw in Sandspiel and Sandspiel Studio, and it's a philosophy that I followed with CellPond too, one of my previous projects. In CellPond, you can spend a lot of time just drawing before getting started with any live programming. It gets you used to the system.
By letting the user play, you remove the pressure and stress of the blank canvas. At some point, they might hear back a certain sound that inspires them to live program a song.
Recording sounds yourself (e.g. with your voice) can be intimidating for some people. This makes it a good use case for exploring how to remove emotional blockers. One way Arroost overcomes this is through mess.
By dragging the Recording shape up and down, the sound changes pitch, and sounds distorted. It intentionally sounds very silly, which can serve as an "ice-breaker" for a timid user. It uses comedy to relax them, before they begin to take the tool seriously.
After creating a few recordings, the user is presented with another shape. The Connection shape.
The Connection shape allows you to connect two shapes together.
Now, when the first shape is clicked, it will trigger the second shape at the same time. I call this firing. When you click a shape, you fire it.
By connecting two sounds together with wires, you can start to create more complex noises.
You can click the triangle button on a wire to change its timing. Click it once to change the timing from "on the same beat" to "one beat later".
This lets you chain sounds together into sequences and loops, and start to make songs.
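Here's a simplified sketch of how you can think about firing (not Arroost's real internals; the shape ids, the Wire type, and the playSound callback are made up for the example): same-beat wires trigger their target immediately, while "one beat later" wires queue the target for the next tick of a beat clock.

```ts
// A rough mental model of firing, sketched for illustration only.

type WireTiming = "same-beat" | "one-beat-later"

interface Wire {
  target: string // id of the shape at the other end
  timing: WireTiming
}

// Hypothetical graph: shape id -> outgoing wires.
const wires = new Map<string, Wire[]>()

// Shapes queued to fire on the upcoming beat.
let nextBeatQueue: string[] = []

function fire(shapeId: string, playSound: (id: string) => void): void {
  playSound(shapeId)
  for (const wire of wires.get(shapeId) ?? []) {
    if (wire.timing === "same-beat") {
      fire(wire.target, playSound) // triggers right now
    } else {
      nextBeatQueue.push(wire.target) // triggers on the next tick
    }
  }
  // (Cycles of same-beat wires are ignored here for brevity.)
}

// Called once per beat by a clock, e.g. setInterval at the tempo.
function tick(playSound: (id: string) => void): void {
  const queue = nextBeatQueue
  nextBeatQueue = []
  for (const shapeId of queue) fire(shapeId, playSound)
}
```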
Notably, it's really hard to time your different sounds perfectly using this approach. The timing of your recordings can get very sloppy and arrhythmic. This is intentional. The user should not spend too much time getting their sequences to line up well. They should be encouraged to keep recording and arranging their sounds in a way that favours creation, not perfection.
The user can adjust the precise timing of sounds by dragging a sound's horizontal line left and right, but they aren't given much visual feedback on this — just a low fidelity horizontal line. It's something they have to "feel out". They can't obsess over lining up sound waves with each other, like you can in most audio editing software. This is something that I have been emotionally blocked by in the past.
There is also no capacity to edit the sound itself. There is no noise reduction, and no effects available. There is no decision paralysis around this. Like in tldraw, you are given a limited set of tools to work with, and everything you make is slightly imperfect. This emotionally unblocks you from obsessing over details.
This is something that community member Shane Crowley experienced and wrote about in his blog post. He stated:
"One of the factors that makes recording vocals difficult is the need for noise-reduction. Usually this requires recording some ambient noise and subtracting it from the recording. This needs continuous tweaking and can result in poor quality audio if you're not in a good recording environment (I rarely am). With Arroost I couldn't do noise reduction, so I stopped caring and just focused on recording. "
When I first added the pitch-shifting feature to Arroost, I received some negative feedback from users. They said it was annoying that they couldn't move around their sounds without it also changing the pitch. Sometimes, they liked to arrange their canvas in a certain way, and keep it clean and neat, and now they couldn't do that.
This was great to hear, because it meant that the pitch-shifting feature was stopping people from obsessing over the cleanliness of their canvas. It was making mess the default state of the tool, which takes the pressure off.
If a user does want to move a sound, they face a difficult choice. Do they really need to move it, and risk ruining its pitch? Or can they leave it be, and carry on creating?
By clicking the timing button again, you can change the timing of a wire to "one beat earlier".
Let me be completely clear. If you set a wire's timing to "one beat earlier", it will trigger the second shape before the first shape is fired, in a way that seems to break causality and the rules of time.
This was very very hard to program.
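To give a flavour of what makes it tricky, here is a simplified sketch of one way "one beat earlier" can work without real time travel: evaluate the firing graph a beat ahead of the audio the listener actually hears, so a wire pointing one beat "backwards" still lands in the audible future. This is not Arroost's real scheduler; the beat numbers, the LOOKAHEAD constant, and the playSound callback are made up for the example.

```ts
// A simplified lookahead sketch, for illustration only.

const LOOKAHEAD = 1 // how many beats the graph runs ahead of playback

// Sounds waiting to be heard, keyed by the beat on which they should play.
const pending = new Map<number, string[]>()

// Convert the beat the listener hears into the beat the graph is working on.
function graphBeatFor(audibleBeat: number): number {
  return audibleBeat + LOOKAHEAD
}

// Called when a wire propagates a firing to `targetId`. `timing` is -1
// ("one beat earlier"), 0 ("same beat"), or +1 ("one beat later").
function propagate(targetId: string, graphBeat: number, timing: number): void {
  const beat = graphBeat + timing // graphBeat - 1 hasn't been heard yet
  const sounds = pending.get(beat) ?? []
  sounds.push(targetId)
  pending.set(beat, sounds)
}

// The audio clock calls this once per audible beat to play everything due,
// while the graph carries on at graphBeatFor(audibleBeat).
function playBeat(audibleBeat: number, playSound: (id: string) => void): void {
  for (const shapeId of pending.get(audibleBeat) ?? []) playSound(shapeId)
  pending.delete(audibleBeat)
}
```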
This phenomenon plays around with the concept of responsibility. It leans into the idea that the user is not causing the sounds of the instrument. It suggests that Arroost itself is responsible for whatever's happening in the simulation. The sounds playing are because of Arroost — not you.
This time travel effect feels so unintuitive, and so hard to follow, that the user should have no hope of properly reasoning about it, or understanding what's going on. They should give up on trying to "own" their creation, and instead focus on trying to influence it.
Similar to the philosophy of Electroplankton and Sound Fantasy, this focus on simulation serves to remove pressure from the user. It removes emotional blockers by lowering the stakes. Live programming offers this affordance of confusion, and it's something that we can lean into.
"It's not my fault that it sounds bad. It's Arroost's fault."
Arroost has many more features that allow for more complex behaviours. For example, you can change the colour of a wire by clicking on the square button.
Coloured wires let you fire further wires conditionally, which lets you create more complex structures like logic gates.
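The exact colour rules are beyond this essay, but to give a flavour of how conditional firing can add up to logic, here is a tiny sketch. The colour names and matching rule are made up purely for illustration, not Arroost's actual semantics: it assumes each firing carries a colour and a coloured wire only passes firings of its own colour.

```ts
// A hypothetical sketch of conditional firing (guessed semantics).

type Colour = "blue" | "green" | "red"

interface ColouredWire {
  target: string
  colour: Colour | "any" // "any" behaves like an ordinary, uncoloured wire
}

// A wire only passes a firing whose colour matches its own.
function shouldPropagate(wire: ColouredWire, firingColour: Colour): boolean {
  return wire.colour === "any" || wire.colour === firingColour
}

// A shape reachable only through, say, a green wire then acts like a crude
// conditional: it fires when the incoming firing is green, and stays silent
// otherwise. Combining such filters is what lets structures like gates emerge.
```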
You can also create a Storage shape by deleting nothing. This allows for even more complex computations.
However, for most users, these features are far too complicated to use. I would like to make this sort of creation easier in future work, but currently, the easiest way to get Arroost to carry out a complex behaviour is to perform it.
Intentionally or otherwise, Arroost encourages live performance, which changes expectations around the music produced in it. It lowers the bar, and mistakes are expected. This is similar to what we see in Algorave and live coding, where mistakes are embraced as part of the culture.
Finally, it is worth noting that Arroost, as a tool, is not in a vacuum. Throughout its development, various members of the community contributed to it as an open source project. I introduced a mantra of "normalise sharing scrappy fiddles" which was picked up by community members as a way of encouraging each other to share their work, no matter how bad they thought it was.
In this sense, the community around Arroost is another "feature" that serves to remove emotional blockers.
This is something that I have only recently started to understand. It is something that I am learning from engaging more with the live coding world. I also see links to the community of Sandspiel, and fans of Maywa Denki's nonsense machines.
As researchers within live programming, we should see the tools we make within their context. They do not exist in isolation, and it would be a failing to ignore the international communities that already exist around live programming, namely the Algorave and live coding communities.
A tool can only succeed within a community.
I was extremely happy to see so many people use Arroost to create music and sounds. For a while I collected examples on my examples page, but there were soon too many to keep up with, so I stopped updating it.
I was most pleased to read Shane Crowley's blog post about his experiences with Arroost. It confirmed many of my goals around the project. I encourage you to read it to get a first-hand account of how emotional blockers can be removed. Here are a few choice quotes:
"Arroost was a new way of making music for me. It made the process more light and joyful than some of my recent painstaking DAW sessions."
"My discovery of Arroost led to a sudden burst of music-making."
"Arroost got me back to a place where I could make something."
My hope is that this project is the start of an ongoing conversation around the different kinds of blockers that we are trying to solve with our live programming tools.
I want there to be more emphasis on the real-world emotional blockers that people experience. And I want that to become a potential angle of critique for live programming tools in general. I do think that it is already a prominent concern of many of us in this space, so I want to encourage everyone to discuss it more explicitly. There is no shame in admitting that you are trying to solve emotional blockers with your tool. I think it's a strength, so you should proudly state it.
And I'll end on this. I do see many researchers trying to make tools that "feel good". But that's not enough. Sometimes, feeling good is not the same as feeling free from emotional blockers. Tackling your emotional needs is sometimes uncomfortable — it can be hard — it can make you feel bad, or frustrated, but I think it's worth it.
Thank you for reading my essay about my silly project Arroost :)
If you would like to learn more about Arroost, here are some things for you.
Lu Wilson, 2024, Todepond Explorations
todepond@gmail.com