The path from vague idea to practical reality is made of the lessons learnt as we engage with material form.
This engagement often begins with a simple determination to make something physical. That determination alone forces you into a different mode of enquiry: you must confront the complexity of the materials you are working with, and at every stage you are confronted with completely unanticipated challenges.
Once we decided to build a hardware-programmable robot, we were confronted with the question of what it takes to make an interface that spans the chasm between useful creative manipulation on one side and hands-on simplicity and immediate tactile engagement on the other.
Two Approaches – One Interesting Question
Etienne immediately went with the aesthetic that spawned the research project in the first place: a robot programmed by a 3×3 array of switches. His intuition is to work within this limit so that his abstraction is guided and controlled by it.
I, on the other hand, jumped straight to trying to work out the limits and qualities of the abstraction first, and from that what hardware interface would be required to operate it. From there, I thought, I could boil things down to the smallest and simplest instantiation of that abstraction.
Etienne’s approach encourages him to just build the project and let the material speak to the abstraction from the outset. Mine (typically) involves a lot more anticipation of potential outcomes and contingencies before we even begin making.
Starting with Abstraction
In the end I decided that at the very base of the abstraction is the combination of a movement of a particular length (a travel) and a rotation of a particular degree (a turn). I can then extend this travel-and-turn abstraction to any complex shape, from a line to a circle.
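To make the claim concrete, here is a minimal sketch (not the project's actual code; all names are mine) of how sequences of (travel, turn) pairs generate shapes from a straight line up to an approximated circle:

```python
import math

# Illustrative sketch: any drawable shape becomes a list of
# (travel, turn) pairs, executed one after another.

def square(side):
    """A square: travel one side, then turn 90 degrees, four times."""
    return [(side, 90.0)] * 4

def polygon(n_sides, side):
    """Any regular polygon: the exterior angle is 360/n."""
    return [(side, 360.0 / n_sides)] * n_sides

def circle(radius, steps=36):
    """A circle approximated by many small travel/turn steps."""
    circumference = 2 * math.pi * radius
    return [(circumference / steps, 360.0 / steps)] * steps

# A straight line is the degenerate case: one travel, no turn.
line = [(100.0, 0.0)]
```

The circle makes the point most clearly: it is nothing more than the same travel-and-turn primitive repeated with small values, so one abstraction really does cover the whole range.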
For this I need a switch to turn each function on and off, a rotary encoder to change its value, and a screen to reflect and feed back those changes. I can then easily economise on the need for a switch and a rotary encoder for each programmed draw function, and instead use one encoder to click through the values for each function (travel and turn); a rotary encoder generally has a push-button function as well as a dial function.
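The click-through model can be sketched as a small state machine. This is a hedged illustration, assuming a two-parameter interface; the class and method names (`EncoderUI`, `on_click`, `on_rotate`) are hypothetical, not from the project:

```python
# Sketch of the one-encoder, one-screen interaction model:
# the push button cycles through the editable parameters,
# and the dial edits whichever parameter is selected.

class EncoderUI:
    def __init__(self):
        # The two parameters of the base abstraction.
        self.params = {"travel": 0.0, "turn": 0.0}
        self.order = ["travel", "turn"]
        self.selected = 0  # index into self.order

    def on_click(self):
        """Push button: advance to the next parameter, wrapping around."""
        self.selected = (self.selected + 1) % len(self.order)

    def on_rotate(self, delta):
        """Dial: adjust the currently selected parameter."""
        key = self.order[self.selected]
        self.params[key] += delta

    def screen(self):
        """What the display shows: the selected parameter and its value."""
        key = self.order[self.selected]
        return f"{key}: {self.params[key]:.1f}"
```

Even in this tiny sketch the cost is visible: to set a turn value you must first click past travel, and every added function deepens the click-through sequence.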
In this economisation (one encoder, one screen, multiple functions) I’ve found a form that I can quickly prototype and begin developing code for. But you can see how my approach quickly leads to a degree of complexity in interaction that might subvert the whole idea of the bot as hardware-programmable, as a tactile tool that promotes immediate engagement. You have to click through multiple dimensions of interface, with the result that the bot may as well require programming at the computer.
A persistent question
These questions of tactility and immediacy in relation to the digital extend well beyond this project…
In the end I can see a compromise somewhere between the two, where an array of nine rotary encoders operates as controls over nine functions. But should those functions be embedded in the hardware interface as particular shapes, or should they be a sequence of assignable functions? The latter makes more sense programmatically, but in terms of tactility and immediacy it appears to necessarily add another layer of abstraction between hardware interface and program.
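The two options can be contrasted in a few lines. This is purely illustrative; the layout, shape names, and `Slot` class are my own assumptions, not decisions the project has made:

```python
# Option 1: functions embedded in the hardware. Each encoder position
# is permanently a particular shape; its dial only scales that shape.
FIXED_LAYOUT = {
    0: "line", 1: "square", 2: "triangle",
    3: "pentagon", 4: "hexagon", 5: "circle",
    6: "arc", 7: "zigzag", 8: "spiral",
}

# Option 2: assignable slots. An encoder is just "slot N"; what it
# draws is bound in software, adding a layer of indirection between
# the knob and the mark it makes.
class Slot:
    def __init__(self):
        self.function = None  # nothing is bound until assigned at runtime

    def assign(self, function):
        self.function = function

slots = [Slot() for _ in range(9)]
slots[0].assign("square")  # the same knob could later mean "circle"
```

The extra layer of abstraction the text worries about is the `assign` step: in the fixed layout a knob *is* a shape, while in the assignable layout a knob only *refers to* one.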