Computation
What computation actually is — and why it might be the most fundamental concept in science, from physics to biology to mind.
“We’re presently in the midst of a third intellectual revolution. The first came with Newton: the planets obey physical laws. The second came with Darwin: biology obeys genetic laws. In today’s third revolution, we’re coming to realize that even minds and societies emerge from interacting laws that can be regarded as computations. Everything is a computation.”
— Rudy Rucker
What is computation?
Computation is, in essence, calculation: the transformation of one or more inputs into one or more outputs. That sounds vague, because it is. The generality is the point.
Consider how we decide who should be Prime Minister. We transform a variety of inputs — trustworthiness, proposed policy changes, track record — into an output: probability of receiving a vote. Those inputs are themselves computed: trustworthiness arises from reputation, known associates, previous conduct. The whole thing is inputs becoming outputs becoming inputs, all the way down.
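The chain of outputs feeding back in as inputs can be sketched as function composition. Everything here is invented for illustration, the factor names, the weights, the numbers; the point is only the shape: computed values become inputs to further computation.

```python
# Toy sketch of "inputs becoming outputs becoming inputs".
# All names and weights are invented purely for illustration.

def trustworthiness(reputation: float, conduct: float) -> float:
    # Trustworthiness is itself an output of earlier inputs.
    return 0.6 * reputation + 0.4 * conduct

def vote_probability(trust: float, policy_appeal: float) -> float:
    # A weighted sum, clamped into [0, 1].
    score = 0.7 * trust + 0.3 * policy_appeal
    return max(0.0, min(1.0, score))

# The output of one computation is the input to the next:
p = vote_probability(trustworthiness(0.8, 0.5), 0.6)
```

Nothing about the specific numbers matters; what matters is that `trustworthiness` and `vote_probability` are the same kind of thing, transformations, and they plug into each other.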
Computation doesn’t require numbers. It requires symbols — which is why the technical term is symbol processing. Because almost anything can be represented symbolically, computers can process speech, images, text, sensor data, and social behaviour, not just arithmetic. The abstraction is intentional: a computer is any automaton capable of processing symbol structures.
The “automaton” part is important. For something to count as a computer, it must be able to execute a series of steps without being walked through each one manually. The instructions are embedded in the system in advance. This is what distinguishes a computer from a person working through a calculation by hand.

A concrete example
A finite state machine is one of the simplest computational abstractions. It has a set of possible states, a starting state, and rules for transitioning between states based on input.
Consider a machine with two states — call them S1 and S2 — and two possible inputs: 0 and 1.
S1 + input 0 → S1
S1 + input 1 → S2
S2 + input 0 → S2
S2 + input 1 → S1
Starting at S1, feed it a string of binary digits. Where it ends tells you something non-trivial: if it finishes in S1, the string contained an even number of 1s; if it finishes in S2, an odd number. No one counted. The counting was implicit in the structure.
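The parity machine above is small enough to write out directly. This is a minimal sketch: the transition table is the four rules just listed, and the machine itself is nothing but a loop that follows them.

```python
# The two-state parity machine described above.
# S1 = "even number of 1s seen so far", S2 = "odd number".

TRANSITIONS = {
    ("S1", "0"): "S1",
    ("S1", "1"): "S2",
    ("S2", "0"): "S2",
    ("S2", "1"): "S1",
}

def run(bits: str) -> str:
    state = "S1"  # the starting state
    for bit in bits:
        state = TRANSITIONS[(state, bit)]
    return state

print(run("1011"))  # three 1s: odd, so the machine ends in S2
```

Note that there is no counter anywhere in the code, just as the text says: the parity is carried entirely by which state the machine is in.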
This is what computers do, scaled enormously. They do basic things — transform inputs, remember results — only very fast and in very large numbers.
Managing complexity: abstraction
A modern computer does not feel like a finite state machine. That’s because it isn’t one system — it’s a hierarchy of systems, each built on top of the last. Transistors become logic gates; logic gates become circuits; circuits become processors; processors become operating systems; operating systems become the applications you use. Each layer performs a specific function and sits inside something with a different function.
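One rung of that hierarchy can be sketched in a few lines. Real hardware builds gates from transistors; here a single NAND primitive stands in for the bottom layer, and each definition below it lives one level up, gates from the primitive, a small circuit from the gates.

```python
# Sketch of layering: richer gates built from one primitive (NAND),
# then a circuit built from the gates. NAND stands in for transistors.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    # One level up again: XOR from four NANDs.
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a: int, b: int) -> tuple[int, int]:
    # A circuit: adds two bits, returning (sum, carry).
    return xor(a, b), and_(a, b)
```

Each layer only uses the layer beneath it, and `half_adder` neither knows nor cares that everything bottoms out in NAND.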
Biological organisms work the same way. You have a cardiovascular system. Within it, a heart. Within that, valves. The valve’s function is distinct from the heart’s function, which is distinct from the cardiovascular system’s function — yet they are not in conflict. They are at different levels of the same hierarchy.
Moving deliberately between these levels is what computer scientists call abstraction. In practice it means: I don’t care how it does it; only what it does. When you call a function that sorts a list, you don’t need to know whether it uses merge sort or quicksort. You know what it returns. The implementation is hidden, and correctly so — it frees your attention for the level that matters right now.
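The sorting point can be made concrete. Here are two hypothetical implementations behind the same contract, "returns the items in ascending order"; a caller that relies only on the contract cannot tell them apart, which is exactly the freedom abstraction buys.

```python
# Two interchangeable implementations of one contract:
# "return the items in ascending order". Names are illustrative.

def sort_via_merge(xs: list[int]) -> list[int]:
    # A straightforward recursive merge sort.
    if len(xs) <= 1:
        return xs[:]
    mid = len(xs) // 2
    left = sort_via_merge(xs[:mid])
    right = sort_via_merge(xs[mid:])
    out = []
    while left and right:
        out.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return out + left + right

def sort_via_builtin(xs: list[int]) -> list[int]:
    # Delegates to Python's built-in sort.
    return sorted(xs)

# The caller sees identical behaviour either way:
data = [3, 1, 2]
result = sort_via_merge(data)
```

Swapping one implementation for the other changes nothing above this line of the hierarchy, so you are free to stop thinking about which one is in use.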
This matters beyond programming because human working memory is finite. At any moment you can hold and manipulate roughly four chunks of information. Abstraction is a technique for managing that constraint: once you understand a component well enough to trust it, you can treat it as a single chunk and stop thinking about its internals. The experienced economist who hears “scarce resource” doesn’t mentally reconstruct the definitions — the whole concept is one chunk, leaving room for the new material on top.
Learning to move fluidly up and down levels of abstraction — zooming out to see the structure, zooming in to examine the mechanism — is not just a programming skill. It is a thinking skill. It is how any sufficiently complex domain becomes manageable.
Computation is inputs becoming outputs, according to rules, automatically. Computers are machines that do this with symbols at scale. And the way we reason about them — the way we reason about anything complex — is by learning which details to ignore and when.