It was the middle of the Second World War. Europe was under siege, and while the Allied forces were able to intercept enemy radio traffic, they could not make sense of it, because it was encrypted. Germany had developed an ingenious device, called the Enigma, which allowed for millions of possible encryption combinations. Being able to decrypt that communication would no doubt be of tremendous value to the Allied forces. One day, a man stepped into the offices of Britain's code-breaking centre at Bletchley Park, claiming he could build a machine that would decrypt the enemy's communication. It was Alan Turing, a mathematician, cryptanalyst and theoretical biologist.

With his research, Alan Turing laid the foundation of modern information technology and software development. Especially nowadays, with the rise of artificial intelligence (AI), the Turing test is a widely known way to ask whether an AI is as intelligent as a human. But Alan Turing did a lot more than that. He also described what we now call the Turing Machine, the simplest possible model of a computer. A Turing Machine (Alan Turing never actually built one, but others have) consists of a reading head that scans zeros and ones on a long band of paper. The machine has a memory, its current state, and it can change direction, moving left or right along the tape, depending on the symbols it reads from the paper roll and a fixed table of instructions. Long story short, Alan Turing mathematically proved that this simple machine is theoretically capable of executing (almost) any algorithm, or what we now call a "program".
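
To make that picture a little more concrete, here is a minimal sketch of such a machine in Python. The state names, the transition table and the little bit-flipping "program" are invented purely for illustration (Turing's formal construction is more general), but the ingredients are the same: a tape of symbols, a head that reads and writes one cell at a time, and a table of instructions that decides what happens next.

```python
def run_turing_machine(tape, transitions, state="start", max_steps=1000):
    """Run the machine until it enters the 'halt' state or max_steps is reached."""
    tape = list(tape)
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        # Grow the tape with blank cells if the head moves past either end.
        if head < 0:
            tape.insert(0, "_")
            head = 0
        elif head >= len(tape):
            tape.append("_")
        symbol = tape[head]
        # Look up what to write, which way to move, and the next state.
        write, move, state = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape)

# Hypothetical "program": scan right, flip every bit, halt on the first blank cell.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("10110", flip_bits))  # prints 01001_
```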

What does this really mean? It means that Alan Turing (remember, this is before modern computers actually existed) proved that software would be capable of doing all the things we need it to do, and that all of these things can be broken down into smaller and smaller pieces until you end up with a program, written down as ones and zeros on a very, very long strip of paper.

And that’s essentially what the whole IT world did. We built computers, that still today work with ones and zero’s inside their chips. These bits are combined to make characters, characters become instructions and instructions become a computer program that we compile into an application.
