@rosemadr asked a really interesting question last session that I thought was worth a second look.
At first I think I was a little quick to paint computation as completely incompatible with standpoint theory, post-normal science, new materialism, etc. On revisiting that thought, though, I wonder whether there is something that can be done here…
My thinking started with New Materialism: when you first interrogate this question, one of the first things you settle on is that digitisation is fundamentally built on binary data, where a bit is either a 0 or a 1. But that supposition itself rests on cleanly dividing signals into binary categories.
Now, there are alternative bases for computation, like ternary and quaternary. These have largely gone unexplored because of the added complexity of building components that can reliably distinguish more than two states. But even these would impose arbitrary ‘cuts’ on the way signals are processed, and computation built on them would seem to need the same kind of rigid categorical assumptions.
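To make the point concrete, here’s a small sketch (my own illustration, in Python; `to_balanced_ternary` is a hypothetical helper name, not from any library) of balanced ternary, the base used in some historical ternary machines. Even with three digit values rather than two, every number is still forced into a discrete grid of categories:

```python
# Balanced ternary uses the digits -1, 0, +1 ("trits"). Even with three
# values instead of two, the representation still imposes discrete cuts:
# every quantity must be snapped to one of the three categories per digit.

def to_balanced_ternary(n: int) -> list[int]:
    """Return the balanced-ternary digits of n, least significant first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:        # a remainder of 2 becomes -1 with a carry
            r = -1
            n += 1
        digits.append(r)
        n //= 3
    return digits

# 5 decomposes as (-1 * 1) + (-1 * 3) + (1 * 9)
print(to_balanced_ternary(5))
```

The carry trick in the loop is just the standard change-of-base division, adjusted so remainders fall in {-1, 0, 1}; the categorical ‘cut’ hasn’t gone away, it has only gained a third bin.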
To truly distance ourselves from an architecture built on rigid categorisation, I think we’d need to start at the level of analogue computers… in my head I’m seeing old-school synthesiser banks and patch bays.
Is there a reason to go back to analogue? Possibly… Digital TV, for instance, can carry far more data, but the signal is effectively either ‘on’ or ‘off’: interference causes tearing and stuttering that makes the picture unusable. Noise in an analogue broadcast, by contrast, produces interference patterns but remains usable under far more adverse conditions. How this might apply to computer programming is something I’m less sure of, but I thought I’d share it as an interesting idea.
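That ‘cliff edge’ behaviour can be simulated in a few lines. The sketch below is a toy model of my own (not a real broadcast standard): a digital receiver thresholds a noisy voltage back to 0 or 1, so it is either perfect or broken, while an ‘analogue’ receiver simply keeps the noisy value and degrades in proportion to the noise.

```python
# Toy model: compare how thresholded (digital) and raw (analogue)
# reception respond to increasing Gaussian channel noise.
import random

random.seed(0)

def transmit(level: float, noise: float) -> float:
    """Send a signal level through a channel with additive Gaussian noise."""
    return level + random.gauss(0, noise)

def digital_error(noise: float, trials: int = 10_000) -> float:
    """Fraction of bits flipped after thresholding the received value at 0.5."""
    errors = 0
    for _ in range(trials):
        bit = random.choice([0.0, 1.0])
        received = 1.0 if transmit(bit, noise) > 0.5 else 0.0
        errors += received != bit
    return errors / trials

def analogue_error(noise: float, trials: int = 10_000) -> float:
    """Mean absolute deviation of the received analogue value from the original."""
    return sum(abs(transmit(0.5, noise) - 0.5) for _ in range(trials)) / trials

for noise in (0.05, 0.3, 0.6):
    print(noise, digital_error(noise), analogue_error(noise))
```

At low noise the digital bit-error rate is essentially zero, then it climbs steeply once the noise approaches the decision threshold; the analogue error, meanwhile, grows smoothly and linearly with the noise level, which is the graceful degradation described above.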