Science

Gottfried Wilhelm Leibniz: How His Binary Systems Shaped the Digital Age

Zeroes and ones that changed the world. 

by Oscar Gonzalez

Binary code is the language of computers and electronic devices. The use of binary numbers dates back to ancient Egypt, but it was the 17th-century philosopher and mathematician Gottfried Wilhelm Leibniz who created the binary number system used today. Google celebrated his 372nd birthday on Sunday with a Google Doodle showing binary numbers.

Born on July 1, 1646, Leibniz made great strides in the fields of philosophy and math. He developed a form of calculus around the same time as Sir Isaac Newton and was considered one of the greatest 17th-century philosophers of rationalism. He was also an inventor, coming up with variations of the mechanical calculator. He then invented the modern binary number system in 1689 as a way to convert verbal logic statements into mathematical ones, using only zeroes and ones.

Gottfried Wilhelm Leibniz (Wikimedia / Ad Meskens)

Leibniz described his system in a 1703 article called "Explication de l'Arithmétique Binaire," or "Explanation of Binary Arithmetic." In it, he showed how any number could be represented using only zeroes and ones, and he noted how simple the resulting arithmetic is: "All these operations are so easy that there would never be any need to guess or try out anything."
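Leibniz's scheme is the positional base-2 system still in use today: each digit stands for a power of two, and a number is built by repeated division by two. As a rough illustration (the function name here is ours, not anything from Leibniz's article), this is how the conversion works in Python:

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to binary by repeatedly
    dividing by 2 and collecting the remainders."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2
    # Remainders come out lowest digit first, so reverse them
    return "".join(reversed(bits))

print(to_binary(7))     # 111
print(to_binary(1703))  # 11010100111 -- the year the article appeared
```

The same repeated-halving procedure, carried out by hand, is all Leibniz needed: no guesswork, just division and remainders, which is what his quote about the ease of the operations refers to.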

At the time, Leibniz's binary system was more a way to combine his philosophical and religious beliefs with math, and it didn't have much practical purpose. That changed when the first computers were developed in the early 20th century, around the time of World War II. These early machines needed a limited language to control their functions, so early computer scientists used binary, with ones and zeroes representing on and off states.

As computers became more sophisticated, binary code became their universal language. Leibniz's work laid the foundation for the Digital Age almost 300 years in advance.

As modern computers continue to improve, researchers are striving to move beyond machines that know only binary code, aiming instead to make them work more like the human brain.