Nowadays, everybody seems to know what an algorithm is. I have a degree in computer science myself, so I learned a few things about algorithms: how to write them, how to use them… but nothing about the history of the concept.
Who First Invented the Algorithm?
Before answering, let’s make something clear: what is an algorithm? The Cambridge Dictionary defines the word as follows:
A set of mathematical instructions or rules that, especially if given to a computer, will help to calculate an answer to a problem.
Today, algorithms are essential to the way computers process data, and people tend to equate them with artificial intelligence, but they are not the same thing. In fact, algorithms are not even limited to computer programming; they are, at heart, simply mathematical procedures.
The truth is, mathematicians have used algorithms for as long as we can tell. But they were not called that at first. What we are really here to talk about is the origin of the word “algorithm.”
Muhammad ibn Mūsā al-Khwārizmī was a Persian polymath from Khwarazm who lived during the ninth century. He was a highly influential one, whose works in mathematics, astronomy, and geography changed the way we do science. This astronomer, who was also at one time the head of the library of the House of Wisdom in Baghdad, wrote “The Compendious Book on Calculation by Completion and Balancing,” which is considered the “first true algebra text.” When the book was translated into Latin in 1145, it was published under the title “Liber algebrae et almucabala.” That is where the word “algebra” came from.
As for “algorithm,” it is also a Latinized version of a word or, to be precise, of a name. Yes, the Latin version of al-Khwārizmī is Algoritmi, which was later modernized into “algorithm.”
Al-Khwārizmī’s other famous book was the “Book of Indian Computation.” This text described algorithms on decimal numbers that could be carried out on a dust board, and it would remain in use for almost three centuries. The work was so influential that its methods gradually replaced the abacus-based arithmetic previously used in Europe.
In short, the term “algorithm” comes from the technique al-Khwārizmī developed for performing arithmetic with Hindu-Arabic numerals.
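To make that concrete, here is a sketch of the kind of digit-by-digit procedure al-Khwārizmī described: adding two decimal numbers by carrying digits, just as one would on paper or on a dust board. The function name and representation (lists of digits) are my own for illustration, not taken from any historical text:

```python
from itertools import zip_longest

def add_decimal(a, b):
    """Add two numbers given as lists of decimal digits (most significant
    first), carrying digit by digit as in the pen-and-paper method."""
    result, carry = [], 0
    # Walk from the least significant digit to the most significant.
    for da, db in zip_longest(reversed(a), reversed(b), fillvalue=0):
        carry, digit = divmod(da + db + carry, 10)
        result.append(digit)
    if carry:
        result.append(carry)
    return result[::-1]

print(add_decimal([2, 4, 7], [1, 8, 5]))  # 247 + 185 = 432 -> [4, 3, 2]
```

The point is not the code itself but the shape of the procedure: a finite, mechanical sequence of steps on decimal digits, which is exactly what made these methods so practical.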
The First Computer Algorithm
The first computer was designed long before one could be built. English polymath Charles Babbage imagined the famous Difference Engine, a mechanical calculator, in the 1820s. In 1837, he introduced its successor, the Analytical Engine, the first design for a general-purpose computer.
Babbage collaborated with Ada Lovelace, the only legitimate child of the poet Lord Byron and the mathematically trained Lady Byron, and a brilliant mathematician in her own right. She realized that the Analytical Engine was capable of more than pure calculation and demonstrated it by publishing, in 1843, the first algorithm intended to be carried out by a computer.
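Lovelace’s 1843 program, published as “Note G” in her notes on the Analytical Engine, computed Bernoulli numbers. As a modern sketch of the same task (using the standard recurrence in Python, not her exact sequence of engine operations):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n (with the B_1 = -1/2
    convention), using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B

print(bernoulli(4))  # B_0..B_4 = 1, -1/2, 1/6, 0, -1/30
```

A few lines today, but on the Analytical Engine this meant carefully sequencing loops and intermediate storage by hand, which is why Note G is widely regarded as the first published computer program.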
Of course, all of this was theoretical, not practical. That changed with Alan Turing, the famous English mathematician and computer scientist.
Turing, too, started on the theoretical side: in 1936, while working on the Entscheidungsproblem (the “decision problem,” closely related to what we now call the halting problem), he introduced a formal notion of computation by machines. He formalized the concepts of algorithm and computation with the Turing machine, an abstract model of a general-purpose computer. The Turing machine laid the foundations for the modern computer.
On the practical level, during World War II, he designed the Bombe, an electromechanical machine that searched through the possible settings of the Enigma machine to crack the code used by the Germans. With it, the British were able to decrypt much of the naval traffic the Germans believed secure.
Computers have evolved a lot since then; algorithms have too, and they continue to do so.
To learn more about Lovelace and Babbage in a fun way, I recommend the graphic novel “The Thrilling Adventures of Lovelace and Babbage: The (Mostly) True Story of the First Computer” by Sydney Padua (Pantheon Graphic Library).