The Wonderful World of Early Computing

The history of computing spans thousands of years - from the primitive notched bones found in Africa, to the invention of the abacus around 2400 BC, to Charles Babbage's Difference Engine, proposed in 1822, to the rise of the Personal Computer (PC) in the 1970s. For the most part, this timeline is marked by devices that bear little or no resemblance to present-day machines in either form or capability.

We've had many posts on Neatorama about the newest and greatest in computers and technology. But for this article, let's go back - way back - and take a look at the wonderful world of early computing.

Lebombo and Ishango Bones

The Lebombo bone is a 35,000-year-old baboon fibula discovered in a cave in the Lebombo mountains in Swaziland. The bone has a series of 29 notches that were deliberately cut to help ancient bushmen calculate numbers and perhaps also measure the passage of time. It is considered the oldest known mathematical artifact.

Ishango bone (Photo: AfricaMaat)

The unusual groupings of the notches on the Ishango bone (see above), discovered in what was then the Belgian Congo, suggest that it was some sort of Stone Age calculating tool. The 20,000-year-old bone hints that early humans had grasped arithmetic series and perhaps even the concept of prime numbers.


Abacus

Today, the abacus is mostly synonymous with the Chinese suanpan, but the device was used in Babylon as early as 2400 BC. The abacus was also found in ancient Egypt, Greece, and Rome, and even the Aztecs had their own version.

The Roman pocket abacus was the first portable calculating device, presumably invented to help tax collectors do math while on the go!

Antikythera Mechanism

In 1900, a Greek sponge diver spotted a shipwreck off the coast of the tiny island of Antikythera. Little did he know that amongst the jewelry and statues recovered from the wreck, the most precious item would be a lump of green rock with gears sticking out of it.

The "rock" turned out to be the earliest known example of an analog computer: an intricate mechanism with more than 30 gears, bearing inscriptions that scientists believe was used to calculate the motions of the sun and the moon against a background of fixed stars.

The Antikythera Mechanism, as the device was named, has been dated to around 100 BC. It would take roughly another 1,000 years for a similar level of technical sophistication to appear again in the West. Who built the machine, and why the technology was lost, remains a mystery.

Napier's Bones

In 1614, Scottish mathematician John Napier proposed a radical idea called the logarithm, which made calculations by hand much easier and quicker. (That wasn't his only contribution to math: Napier was a big proponent of the decimal point, which wasn't widely used until he came along.)

He also created a device, called Napier's bones, that let people perform multiplication as a series of additions (which were much easier to do) and division as a series of subtractions. It could even extract square and cube roots! The invention may seem trivial to you and me, but it was a significant advance in computing at the time.
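The principle behind the rods can be sketched in a few lines of Python (a modern illustration, of course, not anything Napier wrote): each rod is just a lookup table of a digit's multiples, so multiplication reduces to lookups and additions.

```python
# Each "rod" is a lookup table of a digit's multiples, as engraved on the bones.
rods = {d: [d * m for m in range(10)] for d in range(10)}

def napier_multiply(number: int, digit: int) -> int:
    """Multiply a number by a single digit using only rod lookups
    and additions of place-shifted partial products."""
    total = 0
    for place, ch in enumerate(reversed(str(number))):
        partial = rods[int(ch)][digit]      # read the multiple off the rod
        total += partial * 10 ** place      # shift to its place, then add
    return total

napier_multiply(425, 6)  # → 2550
```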

Wilhelm Schickard's Calculating Clock

In 1623, Wilhelm Schickard of the University of Tübingen, Württemberg (now part of Germany), invented the first mechanical calculator. Schickard's contemporaries called the machine the Speeding Clock or the Calculating Clock.

Schickard's calculator, built 20 years before Blaise Pascal's and Gottfried Leibniz's machines, could add and subtract six-digit numbers (with a bell as an overflow alarm!). His friend, the astronomer Johannes Kepler, used the invention to calculate astronomical tables, a big leap for astronomy at the time. For this, some consider Wilhelm Schickard the "Father of the Computer Age."

Wilhelm Schickard died of the bubonic plague in 1635, twelve years after inventing the world's first mechanical calculator. The prototype and plans for the calculator were lost to history until the 20th century, when the machine's design was rediscovered among Kepler's papers.

In 1960, Bruno von Freytag-Löringhoff constructed a working model of Schickard's calculator from the plans. (Image: Institut für Astronomie und Astrophysik, Universität Tübingen)

Blaise Pascal's Pascaline

The second mechanical calculator, called the Pascaline or the Arithmétique, was invented in 1645 by Blaise Pascal. Pascal began work on his calculator when he was just 19 years old, building the device to help his father, a tax collector, crunch numbers.

In 1649, Pascal received a Royal Privilege granting him the exclusive right to make and sell calculating machines in France. However, because of the machine's complexity and its limitations (the Pascaline could only add and subtract, and it frequently jammed), he managed to sell only a little over a dozen.

Blaise Pascal's Pascaline (Photo: WU Wien)

The basic mechanism of the Pascaline is a series of gears: when the first gear, with ten teeth, completed one full rotation (counting one through ten), it advanced a second gear by one step; once the second gear had rotated fully (one hundred), it advanced a third, and so on up the place values. This mechanism is still in use today in car odometers, electricity meters, and gas pumps.
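That carry mechanism is easy to picture in code. Here is a minimal sketch (mine, not a model of the actual gearwork) of odometer-style wheels, where a wheel rolling past 9 advances its neighbor:

```python
def advance(wheels, steps=1):
    """Odometer-style counter: each wheel holds 0-9; when a wheel
    rolls over past 9, it bumps the wheel to its left (the carry)."""
    for _ in range(steps):
        i = len(wheels) - 1              # start at the units wheel
        while i >= 0:
            wheels[i] = (wheels[i] + 1) % 10
            if wheels[i] != 0:           # no rollover: stop carrying
                break
            i -= 1                       # rolled past 9: carry leftward
    return wheels

advance([0, 9, 9])  # → [1, 0, 0]
```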

Leibniz' Stepped Reckoner

Eleven years after Pascal's death, German mathematician Gottfried Wilhelm Leibniz, inspired by a step-counting machine (a pedometer) he had seen, set out to build his own calculator.

Leibniz's design used a special type of gear called the Stepped Drum or Leibniz wheel, a cylinder with nine bar-shaped teeth along its length. He named his machine the Staffelwalze or the Stepped Reckoner.

The machine was a marked improvement over Pascal's design and could add, subtract, multiply, divide, and even evaluate square roots by a series of additions. (Photo: calculmecanique)

Leibniz' Stepped Reckoner (Photo: KerryR)

Despite his genius, Leibniz fell so far out of favor (he had picked a fight with Sir Isaac Newton over who invented calculus) that when he died, his grave went unmarked for 50 years!

The Jacquard Loom

In 1801, straw-hat maker and inventor Joseph Marie Jacquard created a punch-card-controlled loom that enabled one person to produce patterned fabric in a fraction of the time it took a traditional silk weaver.

The pattern of holes in the card determined which weaving rods could pass through; this, in turn, produced the pattern on the fabric.
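As a toy illustration (the real loom raised warp threads with hooks and needles; this rendering is entirely my own), you can think of each card row as a bit pattern selecting which threads get raised on one pass of the shuttle:

```python
def weave_row(card_row):
    """One shuttle pass: a hole (1) lets its hook through, raising
    that warp thread; no hole (0) leaves the thread down."""
    return "".join("#" if hole else "." for hole in card_row)

# A two-row "card chain" producing a simple checkerboard weave.
card_chain = [[1, 0, 1, 0],
              [0, 1, 0, 1]]
pattern = [weave_row(row) for row in card_chain]
# pattern == ["#.#.", ".#.#"]
```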

When he unveiled his invention at an industrial exposition in Paris, traditional silk weavers took to the streets to protest the threat to their livelihood.

Jacquard Loom (Photo: Computer Desktop Encyclopedia)

Thomas de Colmar's Arithmometer

From the invention of Schickard's calculator, it took nearly 200 years for a calculator to become commercially successful.

In 1820, Charles Xavier Thomas de Colmar, a French inventor, created the first commercially successful mechanical calculator. The machine, which used the stepped cylinder invented by Leibniz, was called the Arithmometer. It could add, subtract, multiply (and, with some user intervention, divide), and it remained the calculator of choice for nearly a hundred years. (Photo: Popular-science Museum, The Hague, Netherlands)

Charles Babbage's Difference Engine

In the early 1800s, numerical tables, such as those for polynomial functions, were routinely calculated by humans, who were actually called "computers" (meaning "one who computes"). Understandably, the process was riddled with human error.

In 1822, an eccentric British mathematician and inventor named Charles Babbage proposed a machine called the Difference Engine to calculate a series of mathematical values automatically.

Babbage started to build his first engine, which called for around 25,000 parts, would have weighed 15 tons (13,600 kg), and would have stood 8 feet (2.4 m) high. It was never completed; Babbage left to pursue another idea, the more complex Analytical Engine, which could be programmed using punch cards - an idea far ahead of its time.
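The trick the engine mechanized is the method of finite differences: the n-th differences of a degree-n polynomial are constant, so once the machine is seeded, every new table value requires only additions. A rough Python sketch (the function name and structure are mine) of how the "wheels" add down the column on each turn of the crank:

```python
def difference_engine(f, degree, count):
    """Tabulate a polynomial using only additions, the way the
    Difference Engine did, after seeding it with initial differences."""
    # Seed: f(0), f(1), ..., f(degree), reduced to a difference column.
    row = [f(x) for x in range(degree + 1)]
    column = [row[0]]
    while len(row) > 1:
        row = [b - a for a, b in zip(row, row[1:])]
        column.append(row[0])
    # Crank: each cycle, every wheel adds the difference below it.
    table = []
    for _ in range(count):
        table.append(column[0])
        for i in range(len(column) - 1):
            column[i] += column[i + 1]
    return table

difference_engine(lambda x: x * x + x + 41, 2, 5)  # → [41, 43, 47, 53, 61]
```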

Part of Babbage's first Difference Engine, assembled by his son after
his death using parts found in his workshop. (Photo: Andrew Dunn [wikipedia])

Babbage's Difference Engine is considered one of the first mechanical computers. Despite its unwieldy design, his plan called for a basic architecture remarkably similar to that of a modern computer.

Charles Babbage's Difference Engine No. 2, with Doron Swade of
The Science Museum who oversaw its construction.
(Photo: Science Museum/Science & Society Picture Library)

In 1985, The Science Museum in London began a project to build an actual Difference Engine (an updated version that Babbage designed in 1849), along with its printer (also of Babbage's design). The calculating section of the Engine alone consisted of 4,000 parts (the more sophisticated design called for only a third of the parts required by the first version) and weighed 2,600 kg (5,700 lb). It was completed and working in November 1991. Impressively, the machine is accurate to 31 decimal places!

Interesting fact: one of the reasons Babbage never completed his Difference Engine was that he couldn't help but continuously tinker with and improve the design (he came up with the idea for the Analytical Engine before he could even finish the Difference Engine). This was probably the first recorded instance of feature creep.

George Boole Invented Boolean Algebra

In 1847, self-taught British mathematician George Boole invented a branch of algebra that dealt with logic. In Boole's system, logical operations boil down to three primitives: union (OR), intersection (AND), and complementation (NOT). Boole's idea was brilliant - but at the time, it was criticized or simply ignored by his contemporaries.

It was not until almost a century later, in 1938, that engineer Claude Shannon realized Boolean algebra could be applied to two-valued electrical switching circuits. Shannon's work - and, by extension, that of George Boole - became the foundation of modern digital circuit design and the basis for all digital electronics.
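Boole's three operations map directly onto the bitwise operators of any modern language. A quick Python illustration (my example, not Boole's notation) that also checks one of De Morgan's laws over every pair of two-valued inputs:

```python
def AND(a, b): return a & b   # intersection
def OR(a, b):  return a | b   # union
def NOT(a):    return 1 - a   # complementation

# De Morgan's law: NOT(a AND b) == (NOT a) OR (NOT b) for switch states 0/1.
for a in (0, 1):
    for b in (0, 1):
        assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))
```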

Konrad Zuse's Z3 Computer

In 1941, despite financial hardship and isolation from computing pioneers in other Western countries, German computer pioneer Konrad Zuse created the world's first programmable computer, the Z3, from spare telephone parts.

The Z3 used 2,000 relays (electrically operated switches) and was put to work on aircraft design. Zuse's request to build an electronic successor to the machine was denied by the German government as "strategically unimportant." The original Z3 was destroyed in an air raid on Berlin, so Zuse built a fully functioning replica in the 1960s.

Konrad Zuse's Z3 Computer (photo: Computer History Museum)

Neat facts: Zuse also created the world's first high-level programming language, called Plankalkül, and founded the first computer startup company in 1946.

Bombe and Colossus: Cracking Nazi Codes During World War II

During World War II, Nazi Germany used an electro-mechanical cipher machine called Enigma to encrypt and decrypt coded messages. It used rotors to substitute letters (for example, an "E" might be encoded as a "T"). The genius of the Enigma was its polyalphabetic cipher: the rotation of the rotors caused each subsequent letter to be encoded differently (for example, "EEE" might become "TIF").
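A stripped-down, single-rotor sketch in Python shows why repeated letters encrypt differently. (The wiring string is the substitution alphabet of the historical Enigma rotor I; everything else - plugboard, reflector, ring settings - is simplified away, so this is an illustration rather than a faithful Enigma.)

```python
import string

ALPHABET = string.ascii_uppercase
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"   # Enigma rotor I substitution alphabet

def encode(text, offset=0):
    """Single-rotor polyalphabetic substitution: the rotor steps after
    each letter, so repeated plaintext letters encrypt differently."""
    out = []
    for ch in text:
        contact = (ALPHABET.index(ch) + offset) % 26
        out.append(WIRING[contact])
        offset += 1                      # rotor advances one position
    return "".join(out)

encode("EEE")  # → "LGD": three different ciphertext letters
```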

Enigma machine at the Imperial War Museum, London (Photo: Karsten Sperling), and its rotors

Obviously, the Allies were very interested in breaking the Nazi codes. Before the war, Polish cryptographer Marian Rejewski had devised a machine called the Bomba kryptologiczna (or simply "Bomba" or "bomb") to break Enigma. The machine was clunky and cumbersome, not to mention loud like a bomb (hence the name), but it worked!

In 1939, British intelligence set up the Government Code and Cypher School at Bletchley Park (then known as Station X), 50 miles north of London; the intelligence it produced was codenamed "Ultra." Bletchley Park's workers were a motley group of mathematicians, engineers, crossword experts, and chess champions. Among them was Alan Turing, who would later become known as the father of modern computer science.

To break Enigma's codes, Turing, along with mathematician Gordon Welchman, devised the "Bombe," an electromechanical machine based on Rejewski's earlier design. The Bombe allowed the Allies to routinely decode the bulk of the enemy's encrypted communications.

Turing and Welchman's Bombe (Photo: La bombe de Turing)

Although Enigma was used by field units, the Nazi high command used a more secure system: the Lorenz SZ 40/42 cipher machines. The Allies nicknamed these machines "Fish" and their cipher traffic "Tunny."

A team headed by Tommy Flowers designed and built a special-purpose, vacuum tube-based computer called Colossus to decrypt Tunny traffic. An operator would feed in ciphertext on 5-bit paper tape, which the machine read at an impressive 5,000 characters per second (imagine a paper tape speeding along). A total of 11 Colossi were built for the war effort.

Colossus Mark II (Photo: Public Record Office, London)

After the war, Churchill ordered Bletchley Park closed, all of the Colossus computers broken into "pieces no bigger than a man's hand," and the blueprints burned. Indeed, the project was so secret that the contributions of Flowers and his colleagues went unrecognized for many years after the war.

In 1994, a team led by computer scientist Tony Sale (L) began rebuilding the Colossus. (Photo: MaltaGC [wikipedia])

Harvard Mark I

While working on his doctoral thesis in physics, Howard H. Aiken ran into a problem: he needed numbers for his theory of space-charge conduction in vacuum tubes, but the calculation was too complex for the calculators of the day. The solution was obvious: build a bigger calculator!

In 1943, Aiken and IBM created what is now considered the first universal calculator: a machine 51 feet (16 m) long, 8 feet (2.4 m) tall, and 2 feet (0.6 m) deep, weighing about 10,000 pounds (4,500 kg), called the Harvard Mark I. At the time, the machine was unbelievably fast: it could do 3 calculations per second!

Harvard Mark I in action. (Photo: Computer History Museum)

The Harvard Mark I was built out of relays, switches, clutches, and rotating shafts. It had over 765,000 parts, 3,300 relays, 175,000 connections, and over 500 miles (800 km) of wire. The physicist Jeremy Bernstein once visited Aiken's lab and remarked that the machine sounded "like a roomful of ladies knitting."

Harvard Mark I (Photo: IBM Archives)

ENIAC: World's First Electronic Digital Computer

The ENIAC in the Army's Ballistic Research Laboratory. (L: Glen Beck, R: Frances Holberton) Credit: K. Kempf "Historical Monograph: Electronic Computers Within the Ordnance Corps," U.S. Army Photo.

The birth of the world's first electronic digital computer was ushered in ... by war. In 1943, with World War II raging, the US military realized it needed help calculating artillery firing tables, compilations of ballistic weapon settings. So it contracted John Mauchly and J. Presper Eckert at the University of Pennsylvania to develop ENIAC (Electronic Numerical Integrator and Computer, nicknamed "Eny").

Completed in 1946, ENIAC was a behemoth of a computer: it measured 8.5 feet by 3 feet by 80 feet (2.6 m x 0.9 m x 26 m), covered an area of about 1,800 sq. feet (167 m²), and weighed 27 tons. The complex machine contained 17,468 vacuum tubes, 7,200 crystal diodes, 70,000 resistors, 10,000 capacitors, 1,500 relays, 6,000 manual switches, and over 5 million hand-soldered joints. The machine was so power-hungry - it drew 150 kilowatts of electricity - that it was rumored that when ENIAC was switched on, Philadelphia suffered brownouts! (Well, not really, but it made for a good story.)

While building the machine, Mauchly and Eckert knew that mice would be a problem, so they placed samples of all the wire types available at the time inside a cage with a bunch of mice to see which insulation the critters didn't like. Only wires that passed the "mouse test" went into the machine!

Two women operating the ENIAC's main control panel (L: Betty Jennings, R: Frances Bilas) Credit: U.S. Army Photo.

At the time, the ENIAC was definitely fast: it could perform 5,000 additions, 357 multiplications, or 38 divisions in one second - a thousand times faster than any other calculating machine of its day.

World War II ended before the ENIAC was completed. Nevertheless, the military continued to support the project (the first ENIAC calculations were for a hydrogen bomb project).

The invention of ENIAC was a watershed moment in computing history: it was the machine that proved electronic digital computing was possible. Indeed, many computer scientists speak of two epochs in computer history: Before ENIAC and After ENIAC. We owe the birth of the first modern electronic computer to war spending!

Note: Though Colossus was built before ENIAC, it wasn't "Turing complete" (Colossus was a special-purpose machine and couldn't be used for general calculations the way ENIAC could). Also, the existence of Colossus was kept secret until the 1970s, well after ENIAC's debut. The Zuse Z3, on the other hand, was Turing complete, but it wasn't electronic.

In an ensuing legal battle over ENIAC's patent, another machine, the Atanasoff-Berry Computer (or ABC), was deemed by the courts to be the first computer. Most computer scientists, however, don't consider the legal decision scientifically correct: the ABC wasn't programmable and wasn't Turing complete, so they still regard ENIAC as the first true computer.

World's First Computer "Bug"

On September 9, 1947, U.S. Navy officer Grace Hopper found the first computer "bug": a moth stuck between the relays of the Harvard Mark II (the successor to the Mark I above). She noted it in her log as the "first actual case of bug being found." Though "bug" had been used to mean a technical defect even earlier, the term became popular after this incident.

Hopper went on to create the first compiler for a computer programming language (the A-0 System for the UNIVAC in 1952) and worked on the development of COBOL, one of the earliest high-level programming languages, which allowed programmers to use words instead of machine code. To acknowledge her contributions, the U.S. Navy named a ship after her (a guided missile destroyer, by the way).

Even if you've never heard of Grace Hopper before reading this article, chances are you've heard one of her famous quotes: "It's easier to ask forgiveness than it is to get permission."

I'll be the first to admit that this post - long as it is - doesn't give a complete picture of the development of early computers. We haven't talked about the contributions of Nikola Tesla (who invented the electromechanical AND gate), Vannevar Bush (who brought analog computing to new heights with his differential analyzer and pioneered the concept of the memex, a theoretical machine that presaged the World Wide Web), the invention of the vacuum tube, and so on.

After ENIAC, computer science grew explosively. In 1947, three Bell Labs engineers - William Shockley, John Bardeen, and Walter Brattain - sparked a revolution in computing by inventing the transistor. Computers went commercial as mainframes and later became "personal" as they grew smaller and faster.

One thing is for sure: we're living in the golden age of computing. Computers keep getting faster and faster, and we keep finding new ways to use them in our daily lives.

Previously on Neatorama: The Wonderful World of Early Photography
