February 5, 2018

From Assembler to Ackermann

My first computer science semester in Berlin was about 17 years ago. My most vivid memory from that time is the Theoretical Computer Science course taught by Prof. Dirk Siefkes. He taught us about formal languages, Turing machines, and the nature of computation. His passion for his field and for his students was remarkable. His enthusiasm inspired me back then, and a bit of it lasts until today.

The last claim demands a proof. Well, let me convince you with the rest of this blog post.


We use computational models like Turing machines and λ-calculus to understand the nature of computation. Every undergraduate computer scientist (hopefully) knows that all Turing-complete models and programming languages are essentially equivalent and that any "effectively computable" function can be computed by these models. On the other hand, the halting problem tells us that it is undecidable whether a Turing machine halts on a given input. In an imperative programming language, this undecidability of termination stems from while-loops. So-called primitive recursive functions, in contrast, essentially describe for-loops and are therefore guaranteed to halt. But is there also something in the middle, something between a for-loop and a while-loop?
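To make this contrast concrete, here is a small Python sketch (an illustration of mine, not part of the original argument): a primitive recursive function such as the factorial can be written with a for-loop whose bound is fixed before the loop starts, so it always terminates. A general while-loop, such as the Collatz iteration, carries no such guarantee.

```python
def factorial(n):
    # Primitive recursive flavor: the loop bound is fixed
    # before the loop starts, so termination is guaranteed
    # for every input n.
    result = 1
    for i in range(1, n + 1):
        result *= i
    return result

def collatz_steps(n):
    # General while-loop: whether this halts for every n > 0
    # is the famous open Collatz conjecture -- no termination
    # guarantee can be read off the program text.
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps
```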

LODA stands for Lexicographical Order Descent Assembly and is both a minimalistic programming language and a computational model. It consists of only four types of instructions (five if we count syntactic sugar). LODA programs operate on an unbounded sequence of memory registers $0, $1, $2, ... where each of them stores a natural number. The operands of instructions are either constants, direct memory accesses, or indirect memory accesses. There are only two types of arithmetic operations: "add" and "sub". The latter is maybe not exactly what you would expect: since we allow only natural numbers, subtracting a larger value from a smaller one yields 0. Using these two operations, we can also define the assignment instruction "mov X,Y", which overwrites the register X with the value of Y. It can be defined simply as "sub X,X" followed by "add X,Y".
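These semantics can be sketched in a few lines of Python (the helper names and the dict-based register model are my own illustration, not LODA syntax):

```python
from collections import defaultdict

def make_registers():
    # Unbounded register file: unset registers default to 0.
    return defaultdict(int)

def add(regs, x, y):
    # add X,Y -- add the value of register Y to register X.
    regs[x] += regs[y]

def sub(regs, x, y):
    # sub X,Y -- truncated subtraction on natural numbers:
    # subtracting a larger value from a smaller one yields 0.
    regs[x] = max(regs[x] - regs[y], 0)

def mov(regs, x, y):
    # mov X,Y defined via the two primitives:
    # "sub X,X" clears X, "add X,Y" copies Y into it.
    sub(regs, x, x)
    add(regs, x, y)
```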

The most interesting part is of course the model of loops in LODA. The language supports so-called "lexicographical order descent loops," which we write using the instructions "lpb X,Y" and "lpe". The interesting thing about these loops is that they are guaranteed to stop for every input, yet they are more powerful than primitive recursion. This can be shown by implementing the Ackermann function in LODA.
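The Ackermann function itself is easy to state recursively; here is the standard definition in Python. It is total (it terminates for every input), but it grows so fast that no fixed nesting of for-loops can compute it, which is exactly why it separates primitive recursion from stronger loop constructs:

```python
import sys
sys.setrecursionlimit(100_000)  # the recursion depth explodes quickly

def ackermann(m, n):
    # Total function that is not primitive recursive:
    # it eventually outgrows every fixed nesting of for-loops.
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))
```

Even small inputs are instructive: ackermann(3, n) already grows exponentially, and ackermann(4, 2) has nearly 20,000 decimal digits.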

A somewhat easier example of a LODA program for computing the Fibonacci numbers is shown below. If you are curious how this works and why the expressive power of LODA is "between" for-loops and while-loops, check out the documentation and the C++ implementation on GitHub.
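For reference, the computation such a program performs can be sketched in Python (this is my illustration of the underlying algorithm, not the LODA listing): two registers hold consecutive Fibonacci numbers while a counter strictly descends to zero, which is the kind of descent that makes termination evident.

```python
def fibonacci(n):
    # Two "registers" a and b hold consecutive Fibonacci
    # numbers; the counter n strictly decreases to zero,
    # so the loop is guaranteed to terminate.
    a, b = 0, 1
    while n > 0:
        a, b = b, a + b
        n -= 1
    return a
```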