With the onset of cloud computing and the Internet of Things, computation became a tacit infrastructure of everyday life. With biocomputing, computational capacities themselves may spill into the environment - it may no longer make sense to talk about discrete objects endowed with computational capacities, such as “computers” or “smartphones”. Instead, biocomputing promises a world where computation becomes an ambient property of our surroundings. So what comes after the cloud?
The open sea of the internet? Unlikely. The dark forest?[1] Maybe. Our guess is that, with the Cambrian explosion of biological substrates, the future hides in a warm, comforting bowl of computational soup. TT-100 - the first DNA computer - was a laboratory test tube filled with 100 microliters of DNA solution. Twenty years from now, you may be casually uploading a spreadsheet into something resembling a hotpot. The wet, liquid, flubber-ish nature of biocomputational substrates is here to stay.[2]
Unlike a good old-fashioned computer with discrete, separated parts, a cell, a microorganism, or just a DNA macromolecule floating in a test tube has a far more porous and fluid nature. In these environments, computational elements bump into each other, exchanging information through chemical messaging in a liquid medium. That comes with interference, noise, and heterodox mixtures. Hence, alongside the soup, another popular food can serve as a metaphor for the material features of biocomputing: the burrito. In biological environments, things are squashed against each other, creating contiguous layers of information processing. Unlike semiconductor circuits - which diagram the flow of information in the electron labyrinths carved out on the surface of microchips - a biological information circuit is more of an abstraction: in reality, it is seldom isolated from its environment.
The age of souper-computing also transforms the way we may see information networks in general. Today, the dominant metaphor of cloud computing suggests a split between two distinct layers: the peripheral devices, such as smartphones or PCs, and the ephemeral background (the cloud), which stores most of the data and where the real computational heavy lifting takes place. The cloud, however, is physically located in remote, high-capacity, energy-intensive data centers that are rapidly eating up land and resources.
Here lies one of the major implications of biocomputing. Biocomputing is energy-efficient, and combined with parallel processing capacity and abundant storage space, it enables network designs that stir and mix the cloud, throwing the data centers and the peripheral devices into one big bowl. Shake well, and you get the computational soup: an ambient computational goo, hyper-local and universal at the same time.
Hence, the practical consequences of computation becoming soup-like are twofold:
First, as for computational onboarding, it is important to note that, unlike good old-fashioned computers, biological wetware is highly modular - you may always blend in extra storage or computing power, or change the properties of the overall network. It is as customisable as a good hotpot can get. In domestic settings, a computational soup can take the form of an ambient computational medium emerging out of a mixture of local biocomputational agents: say, the phytostats in the rooms, plus the organoid processing units stored in your fridge, plus a living roof that is actually a mixture of soil, moss, and DNA storage ooze. Strictly speaking, there is no central processing unit or main storage - computational tasks come and go like tides through the whole network, breaking down into redundant copies that make the network as a whole incredibly robust.
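To make the redundancy intuition concrete, here is a minimal sketch in plain Python - a deliberately crude analogue, not a description of any real biocomputational system. The agent names and the replication scheme are invented for illustration: a task is split into chunks, each chunk is copied onto several randomly chosen agents, and the task survives even when some of them drop out of the soup.

```python
import random

# Hypothetical household agents of a domestic computational soup (invented names).
AGENTS = ["phytostat-livingroom", "phytostat-kitchen",
          "organoid-unit-fridge", "roof-ooze-north", "roof-ooze-south"]
REPLICATION = 3  # each chunk lives on three agents at once

def scatter(task, agents, replication=REPLICATION):
    """Break a task into chunks and place redundant copies on random agents."""
    return {chunk: random.sample(agents, k=replication) for chunk in task}

def recoverable(placement, alive):
    """The task is recoverable if every chunk still has at least one living host."""
    return all(any(host in alive for host in hosts) for hosts in placement.values())

if __name__ == "__main__":
    task = [f"chunk-{i}" for i in range(8)]
    placement = scatter(task, AGENTS)
    # Simulate two agents silently dropping out of the soup.
    alive = set(random.sample(AGENTS, k=len(AGENTS) - 2))
    print("surviving agents:", sorted(alive))
    print("task recoverable:", recoverable(placement, alive))
```

The point is not this particular scheme but the design consequence it hints at: robustness comes from cheap replication across many modest hosts rather than from one well-guarded central machine.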
Second, computational soup invites us to rethink interface design. Just as you can modify your soup with extra ingredients - a bit of lettuce here, a bit of tofu there - the logic of addition, subtraction, and the dynamic transformation of the whole by changing its parts can be valorized in the design of computational elements. Think of Physarum polycephalum (commonly known as slime mold), which is capable of smart decision-making, such as figuring out the most efficient route between multiple points in space. This being - itself oscillating on the threshold between the organic and the inorganic - computes with its body: it expresses the results of computational processes through changes in its morphology. Andrew Adamatzky, one of the pioneers of slime mold computing, calls this morphological computation. While practical applications of slime mold computing currently reside mostly in the world of hypotheticals, the very idea of a change in shape as the expression of computation paves the way to a speculative possibility: morphological interfaces - biological surfaces that are programmed and exhibit their functions through modifications of their structure, whether through direct contact with human users or through the surface’s intrinsic dynamics.[3]
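For readers who want the route-finding intuition spelled out, the sketch below is a toy rendition of the well-known mathematical “Physarum solver” model of slime mold foraging (flow reinforces tubes, idle tubes wither), run on a four-node graph. It is an assumption-laden illustration of the principle, not Adamatzky’s experimental setup and not a morphological interface.

```python
import numpy as np

# Toy graph: nodes 0..3; two routes from node 0 (source) to node 3 (sink):
# the short one 0-1-3 (total length 2) and the long one 0-2-3 (total length 4).
edges = [(0, 1, 1.0), (1, 3, 1.0), (0, 2, 2.0), (2, 3, 2.0)]
n_nodes, source, sink, inflow = 4, 0, 3, 1.0

lengths = np.array([length for _, _, length in edges])
D = np.ones(len(edges))  # tube conductivities ("thickness"), initially equal

for _ in range(100):
    # Kirchhoff-style balance: solve for node pressures given current tube thickness.
    A = np.zeros((n_nodes, n_nodes))
    for k, (i, j, _) in enumerate(edges):
        g = D[k] / lengths[k]
        A[i, i] += g
        A[j, j] += g
        A[i, j] -= g
        A[j, i] -= g
    b = np.zeros(n_nodes)
    b[source], b[sink] = inflow, -inflow
    A[sink, :] = 0.0        # pin the sink pressure at zero
    A[sink, sink] = 1.0
    b[sink] = 0.0
    p = np.linalg.solve(A, b)

    # Flux through each tube; well-used tubes thicken, idle tubes slowly wither.
    Q = np.array([D[k] / lengths[k] * (p[i] - p[j])
                  for k, (i, j, _) in enumerate(edges)])
    D = D + 0.1 * (np.abs(Q) - D)

for (i, j, _), d in zip(edges, D):
    print(f"tube {i}-{j}: conductivity {d:.3f}")
# Expected outcome: the 0-1-3 tubes stay thick, the 0-2-3 detour shrinks toward zero.
```

Here the “result” of the computation is read off from the final thickness of the tubes - a numerical stand-in for the morphological idea that the shape of the network is the answer.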