The Wonderful World of Early Computing

The history of computing spans thousands of years - from the primitive notched bones found in Africa, to the invention of the abacus around 2400 BC, to Charles Babbage's Difference Engine, proposed in 1822, to the rise of the personal computer (PC) in the 1970s. For most of that span, the timeline is marked by devices that bear little or no resemblance to present-day machines in either form or capability.

We've had many posts on Neatorama about the newest and greatest in computers and technology. But for this article, let's go back - way back - and take a look at the wonderful world of early computing.

Lebombo and Ishango Bones

The Lebombo bone is a 35,000-year-old baboon fibula discovered in a cave in the Lebombo mountains in Swaziland. The bone has a series of 29 notches that were deliberately cut to help ancient bushmen calculate numbers and perhaps also measure the passage of time. It is considered the oldest known mathematical artifact.


Ishango bone (Photo: AfricaMaat)

The unusual groupings of the notches on the Ishango bone (see above), discovered in what was then the Belgian Congo, suggest that it was some sort of Stone Age calculation tool. The 20,000-year-old bone hints that its makers had grasped arithmetic series and perhaps even the concept of prime numbers.

Abacus

Today, the abacus is mostly synonymous with the Chinese suanpan, but counting devices of this kind were used in Babylon as early as 2400 BC. The abacus was also found in ancient Egypt, Greece, and Rome. Even the Aztecs had their own version.

The Roman pocket abacus was the first portable calculating device, presumably invented to help tax collectors do math while on the go!

Antikythera Mechanism

In 1900, a Greek sponge diver spotted a shipwreck off the coast of the tiny island of Antikythera. Little did he know that amongst the jewelry and statues recovered from the wreck, the most precious item would be a lump of green rock with gears sticking out of it.

The "rock" turned out to be the earliest known example of an analog computer: an intricate mechanism of more than 30 gears, covered with inscriptions, that scientists believe was used to calculate the motions of the sun and the moon against a background of fixed stars.

The Antikythera Mechanism, as the device was named, has been dated to around 100 BC. It would take roughly another 1,000 years for a similar level of technical sophistication to reappear in the West. Who built the machine, and why the technology was lost, remains a mystery.

Napier's Bones

In 1614, Scottish mathematician John Napier introduced a radical idea, the logarithm, that made calculations by hand much easier and quicker. (That wasn't his only contribution to math: Napier was also a big proponent of the decimal point, which wasn't in wide use until he came along.)

He also created a device, called Napier's bones, that let people perform multiplications as a series of additions (which were a lot easier to do) and divisions as a series of subtractions. It could even handle square and cube roots! The invention may seem trivial to you and me, but it was a significant advance in computing at the time.
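To get a feel for why turning multiplication into addition was such a labor-saver, here is a minimal Python sketch (purely illustrative, obviously not anything Napier could have run) of the core logarithm trick: one multiplication becomes two table lookups, one addition, and one reverse lookup.

```python
import math

# The whole trick: log10(a * b) == log10(a) + log10(b).
# With a printed log table, a long multiplication becomes two lookups,
# one addition, and one anti-log (reverse) lookup.
a, b = 47.0, 83.0

log_sum = math.log10(a) + math.log10(b)   # look up both logs, add them
product = 10 ** log_sum                   # look the sum up in the anti-log table

print(round(product, 6))   # ~3901.0
print(a * b)               # 3901.0, for comparison
```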

Wilhelm Schickard's Calculating Clock

In 1623, Wilhelm Schickard of the University of Tübingen, Württemberg (now part of Germany), invented the first mechanical calculator. Schickard's contemporaries called the machine the Speeding Clock or the Calculating Clock.

Schickard's calculator, built about two decades before Blaise Pascal's machine (and half a century before Gottfried Leibniz's), could add and subtract six-digit numbers, ringing a bell as an overflow alarm. Schickard built it with his friend, astronomer Johannes Kepler, in mind, to speed the calculation of astronomical tables - a big potential leap for astronomy at the time. For this, some consider Wilhelm Schickard the "Father of the Computer Age."

Wilhelm Schickard died of the bubonic plague in 1635, twelve years after inventing the world's first mechanical calculator. The prototype and plans for the calculator were lost to history until the 20th century, when the machine's design was rediscovered among Kepler's papers.

In 1960, Bruno von Freytag-Löringhoff of Tübingen constructed a working model of Schickard's calculator from the plans. (Image: Institut für Astronomie und Astrophysik, Universität Tübingen)

Blaise Pascal's Pascaline

The second mechanical calculator, called the Pascaline or the Arithmétique, was invented in 1645 by Blaise Pascal. Pascal started working on his calculator when he was just 19 years old, partly out of boredom and partly to help his father, a tax collector, crunch numbers.

In 1649, Pascal received a Royal Privilege giving him the exclusive right to make and sell calculating machines in France. However, because of the complexity of his machine and its limitations (the Pascaline could only add and subtract, and it frequently jammed), he managed to sell just a little over a dozen.


Blaise Pascal's Pascaline (Photo: WU Wien)

The basic mechanism of the Pascaline is a series of gears: when the first gear, with ten teeth, completes one full rotation (counting one to ten), it advances a second gear by one tooth; when that second gear completes its own rotation (one hundred), it advances a third, and so on. This carry mechanism is still in use today in car odometers, electricity meters, and gas pumps.
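Here is a minimal Python sketch of that carry idea (a toy model of the principle, not of the actual gearing): each "wheel" holds a digit from 0 to 9 and nudges its neighbour forward by one when it rolls over.

```python
def add_one(wheels):
    """Advance an odometer-style register by one.
    wheels[0] is the ones wheel, wheels[1] the tens wheel, and so on."""
    for i in range(len(wheels)):
        wheels[i] += 1
        if wheels[i] < 10:      # no rollover, nothing more to do
            return wheels
        wheels[i] = 0           # wheel rolls over from 9 back to 0 ...
        # ... and the loop moves on to nudge the next wheel (the carry)
    return wheels

register = [9, 9, 0]            # reads as 099
add_one(register)
print(register)                 # [0, 0, 1] -> reads as 100; the carry rippled through
```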

Leibniz' Stepped Reckoner

Eleven years after Pascal's death, German mathematician Gottfried Wilhelm Leibniz, inspired by a step-counting machine (a pedometer) he had seen, set out to build his own calculator.

Leibniz's design used a special type of gear called the Stepped Drum or Leibniz wheel, a cylinder with nine bar-shaped teeth along its length. He named his machine the Staffelwalze or the Stepped Reckoner.

The machine was a marked improvement on Pascal's design: it could add, subtract, multiply, divide, and even extract square roots, all by reducing the work to a series of additions. (Photo: calculmecanique)
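The multiplication really was a series of additions: add the multiplicand once for each unit of the current multiplier digit, then shift the carriage one place and repeat. A rough Python sketch of that shift-and-add idea (an illustration of the principle, not a model of the actual mechanism):

```python
def stepped_multiply(multiplicand, multiplier):
    """Multiply using nothing but addition, digit by digit,
    the way a stepped-drum calculator operator would."""
    total = 0
    shift = 1                                # carriage position: 1, 10, 100, ...
    while multiplier > 0:
        digit = multiplier % 10              # number of crank turns at this position
        for _ in range(digit):
            total += multiplicand * shift    # repeated addition
        multiplier //= 10
        shift *= 10                          # shift the carriage one place left
    return total

print(stepped_multiply(347, 29))             # 10063, same as 347 * 29
```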


Leibniz' Stepped Reckoner (Photo: KerryR)

Despite his genius, Leibniz was so out of favor (he had picked a fight with Sir Isaac Newton over who invented calculus) that when he died, his grave went unmarked for 50 years!

The Jacquard Loom

In 1801, straw hat maker and inventor Joseph Marie Jacquard created a punch-card-controlled loom that enabled one person to produce patterned fabric in a fraction of the time it would take a traditional silk weaver.

The pattern of holes in the card determined which weaving rods could pass through. This, in turn, made the pattern on the fabric.
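In modern terms, each card row is just a row of yes/no values: a hole lets a rod (and its thread) through, no hole blocks it. A tiny, purely illustrative Python sketch:

```python
# Each card row is a pattern of holes: 1 = hole (rod passes, thread raised),
# 0 = no hole (rod blocked, thread stays down).
card_rows = [
    [1, 0, 1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1, 0, 1],
]

for row in card_rows:
    # Render raised threads as '#' and lowered ones as '.'
    print("".join("#" if hole else "." for hole in row))
# Output:
# #.#.#.#.
# .#.#.#.#
```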

When he unveiled his invention at an industrial exposition in Paris, traditional silk weavers took to the street to protest the threat to their livelihood.


Jacquard Loom (Photo: Computer Desktop Encyclopedia)

Thomas de Colmar's Arithmometer

After the invention of Schickard's calculator, it took nearly 200 years for mechanical calculators to become commercially successful.

In 1820, Charles Xavier Thomas de Colmar, a French inventor and entrepreneur, created the first commercially successful mechanical calculator. The machine, which used the stepped cylinder invented by Leibniz, was called the Arithmometer. It could add, subtract, multiply (and, with some user intervention, divide) and remained the calculator of choice for nearly a hundred years. (Photo: Popular-science Museum, The Hague, Netherlands)

Charles Babbage's Difference Engine

In the early 1800s, numerical tables, such as tables of polynomial values, were routinely calculated by hand; the people who did this work were actually called "computers" (meaning "one who computes"). Understandably, the process was riddled with human error.

In 1822, an eccentric British mathematician and inventor named Charles Babbage proposed a machine called the Difference Engine to calculate a series of mathematical values automatically.

Babbage started to build his first engine, a design calling for around 25,000 parts that would have weighed 15 tons (13,600 kg) and stood 8 feet (2.4 m) high. It was never completed: Babbage moved on to pursue another idea, the more complex Analytical Engine, which could be programmed using punch cards - an idea far ahead of its time.


Part of Babbage's first Difference Engine, assembled by his son after
his death using parts found in his workshop. (Photo: Andrew Dunn [wikipedia])

Babbage's Difference Engine is considered one of the first mechanical computers. Despite its unwieldy design, his plans called for a basic architecture very similar to that of a modern computer.


Charles Babbage's Difference Engine No. 2, with Doron Swade of
The Science Museum who oversaw its construction.
(Photo: Science Museum/Science & Society Picture Library)

In 1985, The Science Museum in London started a project to build an actual Difference Engine (actually the updated version that Babbage designed in 1849), along with a printer, also of Babbage's design. The calculating section of the Engine alone consisted of 4,000 parts (this more sophisticated design called for a third of the parts required for the first version of the Engine) and weighed 2,600 kg (5,700 lb). It was completed and working in November 1991. Impressively, the machine works to 31 digits of precision!

Interesting fact: one of the reasons Babbage never completed his Difference Engine was that he couldn't stop tinkering with and improving the design (he came up with the idea for the Analytical Engine before he had even built the Difference Engine). This was probably the first recorded instance of feature creep.

George Boole Invented Boolean Algebra

In 1847, self-taught British mathematician George Boole invented a branch of algebra that dealt with logic. In Boole's system, logical operations boil down to three basic ones: union (OR), intersection (AND), and complementation (NOT). Boole's idea was brilliant - but at the time it was largely dismissed and ignored by his contemporaries.

It was not until almost 100 years later, in 1938, that engineer Claude Shannon realized that Boolean algebra could be applied to two-valued electrical switching circuits. Shannon's work - and by extension George Boole's - became the foundation of modern digital circuit design and the basis for all digital electronics.
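To make Shannon's insight concrete, here is a small Python sketch (an illustration of the principle, not any historical circuit) showing that Boole's three operations on two-valued signals are already enough to add binary digits - the basic building block of a digital computer.

```python
def NOT(a):    return 1 - a
def AND(a, b): return a & b
def OR(a, b):  return a | b

def half_adder(a, b):
    """Add two one-bit numbers using only Boole's three operations."""
    carry = AND(a, b)
    # Exclusive-or built from AND, OR and NOT: true when exactly one input is 1.
    total = AND(OR(a, b), NOT(AND(a, b)))
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
# 1 + 1 -> sum 0, carry 1, i.e. binary 10
```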

Konrad Zuse's Z3 Computer

In 1941, despite financial hardship and isolation from researchers in other Western countries, German computer pioneer Konrad Zuse created the world's first working programmable computer, the Z3, built largely from spare telephone relays.

The Z3 used about 2,000 relays (electromechanical switches) and was employed for aircraft design calculations. Zuse's request to build an electronic successor to the machine was denied by the German government as "strategically unimportant." The original Z3 was destroyed in an air raid on Berlin, so Zuse built a fully functioning replica in the 1960s.


Konrad Zuse's Z3 Computer (photo: Computer History Museum)

Neat facts: Zuse also designed the world's first high-level programming language, called Plankalkül, and founded the first computer startup company in 1946.

Bombe and Colossus: Cracking Nazi Codes During World War II

During World War II, Nazi Germany used an electro-mechanical cipher machine called Enigma to encrypt and decrypt coded messages. It used rotors to substitute letters (for example, an "E" might be coded as a "T"). The genius of the Enigma was that it implemented a polyalphabetic cipher: the rotation of the rotors meant that each subsequent letter was encoded in a different way (so "EEE" might become "TIF").
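A toy Python sketch of the polyalphabetic idea (vastly simplified - no reflector, plugboard, or multiple rotors): a single substitution "rotor" that steps after every letter, so the same plaintext letter encrypts differently each time.

```python
import string

ALPHABET = string.ascii_uppercase
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"   # one fixed scrambled alphabet

def toy_rotor_encrypt(plaintext):
    out = []
    for step, ch in enumerate(plaintext):
        # The rotor advances one position per letter typed, so identical
        # plaintext letters come out as different ciphertext letters.
        offset = (ALPHABET.index(ch) + step) % 26
        out.append(ROTOR[offset])
    return "".join(out)

print(toy_rotor_encrypt("EEE"))   # prints 'LGD' - three different letters for 'EEE'
```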


Enigma Machine at the Imperial War Museum, London (Photo: Karsten Sperling) and its rotors (Photo: ilord.com). More Enigma photos at wltp.com/enigma

Obviously, the Allies were very interested in breaking the Nazi codes. Before the war, Polish cryptographer Marian Rejewski had devised a machine called the Bomba kryptologiczna (or simply "Bomba" or "bomb") to break Enigma. The machine was clunky and cumbersome, not to mention loud like a bomb (hence the name), but it worked!

In 1939, British intelligence set up the Government Code and Cypher School at Bletchley Park (then known as Station X), 50 miles north of London; the intelligence it produced was distributed under the codename "ULTRA." Bletchley Park's workers were a motley group of mathematicians, engineers, and even crossword experts and chess champions. Amongst them was Alan Turing, who later became known as the father of modern computer science.

To break Enigma codes, Turing, along with mathematician Gordon Welchman, devised the "Bombe," an electromechanical machine based on Rejewski's earlier design. The Bombe allowed the Allies to routinely decode the bulk of the enemy's encrypted communications.


Turing and Welchman's Bombe (Photo: La bombe de Turing)

Although Enigma was used by field units, the Nazi high command used a more secure system: the Lorenz SZ 40/42 cipher machines. The Allies nicknamed these machines "Fish" and their cipher traffic "Tunny."

A team headed by Tommy Flowers designed and built a special-purpose, vacuum tube-based computer called Colossus to decrypt Tunny traffic. An operator would feed in ciphertext on 5-bit punched paper tape, which the machine read at an impressively fast (imagine the paper tape speeding along) 5,000 characters per second. A total of 11 Colossi were built for the war effort.


Colossus Mark II (Photo: Public Record Office, London)

After the war, Churchill ordered Bletchley Park closed, the Colossus computers broken into "pieces no bigger than a man's hand," and their blueprints burned. Indeed, the project was so secret that the contributions of Flowers and his colleagues weren't recognized for many years after the war.


In 1994, a team led by computer scientist Tony Sale (L) began rebuilding the Colossus. (Photo: MaltaGC [wikipedia])

Harvard Mark I

When he was working on his doctoral thesis in physics, Howard H. Aiken ran into a problem - he needed numbers for his theory of space-charge conduction in vacuum tubes, but the problem was too complex for calculators of the day. The solution was obvious: build a bigger calculator!

In 1943, Aiken and IBM created what is now considered the first universal calculator: a 51 ft (16 m) long, 8 ft (2.4 m) tall and 2 feet (0.6 m) deep machine weighing about 10,000 pounds (4,500 kg) called the Harvard Mark I. At the time, the machine was unbelievably fast: it could do 3 calculations per second!


Harvard Mark I in action. (Photo: Computer History Museum)

Harvard Mark I was built out of relays, switches, clutches, and rotating shafts. It had over 765,000 parts, 3,300 relays, 175,000 connections, and over 500 miles (800 km) of wire. The physicist Jeremy Bernstein once visited Aiken's lab and remarked that the machine made a noise "like a roomful of ladies knitting."


Harvard Mark I (Photo: IBM Archives)

ENIAC: World's First Electronic Digital Computer


The ENIAC in the Army's Ballistic Research Laboratory. (L: Glen Beck, R: Frances Holberton) Credit: K. Kempf "Historical Monograph: Electronic Computers Within the Ordnance Corps," U.S. Army Photo.

The birth of the world's first electronic digital computer was ushered in ... by war. In 1943, in the midst of World War II, the US military realized that it needed help calculating artillery firing tables, compilations of ballistic weapon settings. So it contacted John Mauchly and J. Presper Eckert at the University of Pennsylvania to develop ENIAC (Electronic Numerical Integrator And Computer, nicknamed "Eny").

Completed in 1946, ENIAC was a behemoth of a computer: it measured 8.5 feet by 3 feet by 80 feet (2.6 m x 0.9 m x 26 m), occupied about 1,800 sq ft (167 m2) of floor space, and weighed 27 tons. The complex machine contained 17,468 vacuum tubes, 7,200 crystal diodes, 70,000 resistors, 10,000 capacitors, 1,500 relays, 6,000 manual switches, and over 5 million hand-soldered joints. The machine was so power-hungry - it drew 150 kilowatts of electricity - that it was rumored that when ENIAC was turned on, Philadelphia suffered brownouts! (Well, not really, but it made for a good story.)

When they were building the machine, Mauchly and Eckert knew that mice would be a problem, so they put samples of all the wires that were available at the time inside a cage with a bunch of mice to see which wire insulation the critters didn't like! They only used wires that passed the "mouse test."


Two women operating the ENIAC's main control panel (L: Betty Jennings, R: Frances Bilas) Credit: U.S. Army Photo.

At the time, the ENIAC was definitely fast: it could perform 5,000 additions, 357 multiplications, or 38 divisions in one second - a thousand times faster than any other calculating machine of the day.

World War II ended before the ENIAC was completed. Nevertheless, the military continued to support the project (the first ENIAC calculations were for a hydrogen bomb project).

The invention of ENIAC was a watershed moment in computing history. It was the computer that proved electronic digital computing was practical. Indeed, many computer scientists speak of two epochs in computer history: Before ENIAC and After ENIAC. We owe the birth of the first modern electronic computer to war spending!

Note: Though Colossus was constructed before ENIAC, it wasn't "Turing complete" (Colossus had a specialized function and couldn't be used for general calculations the way ENIAC could). Also, the existence of Colossus was kept secret until the 1970s, well after ENIAC's debut. The Zuse Z3, on the other hand, was Turing complete, but it wasn't electronic.

In a later legal battle over ENIAC's patent, another machine, the Atanasoff-Berry Computer (or ABC), was deemed by the courts to be the first computer. Many computer scientists, however, don't consider that legal decision scientifically definitive: the ABC wasn't programmable and wasn't Turing complete, so they still regard ENIAC as the first true general-purpose computer.

World's First Computer "Bug"

On September 9, 1947, technicians working on the Harvard Mark II (successor to the Mark I above) found the famous first computer "bug": a moth stuck between the machine's relays. U.S. Navy officer Grace Hopper, who worked on the project, loved to tell the story, and the moth was taped into the logbook next to the note "first actual case of bug being found." The word "bug" had been used for engineering faults long before, but the incident helped popularize the term in computing.

Hopper went on to create the first compiler for a computer programming language (the A-0 System for the UNIVAC in 1952) and worked on the development of COBOL, one of the earliest high-level programming languages that allowed programmers to use words instead of machine codes. To acknowledge her contributions, the U.S. Navy named a ship after her (it's a guided missile destroyer, by the way).

Even if you've never heard of Grace Hopper before reading this article, chances are you've heard one of her famous quotes: "It's easier to ask forgiveness than it is to get permission."


I'll be the first to admit that this post - long as it is - doesn't give a complete picture of the development of early computers. We haven't talked about the contributions of Nikola Tesla (often credited with an early electromechanical AND gate), Vannevar Bush (who brought analog computing to new heights with his differential analyzer and who pioneered the concept of the memex, a theoretical idea similar to the World Wide Web), the invention of the vacuum tube, and so on.

After ENIAC, there was explosive growth in computer science. In 1947, three Bell Labs scientists - William Shockley, John Bardeen, and Walter Brattain - sparked a revolution in computing by inventing the transistor. Computers went commercial as mainframes and later became "personal" as they grew smaller and faster.

One thing is for sure: we're living in the golden age of computing. Computing power keeps getting faster, and we're finding ever more ways to use computers in our daily lives.

Previously on Neatorama: The Wonderful World of Early Photography


leibniz's stepped reckoner is a hurdy gurdy in disguise!

there's a guy called nicholas gessler who is a historian of early computing - he has a collection of 'physical memory' devices which is fascinating. you can read an interview with him here:

http://theiff.org/publications/cab21-gessler.html
What about the invention of the nerdy glasses, the pocket protector and disdain for non-techies who ask asinine questions like, "How do I fix the cup holder on my PC?" Sure, it's not as important as a programming language, but significant nonetheless. Am I right? (Smile)

Jim Stroud
Technical Sourcing Consultant
Microsoft
You have forgotten the efforts of Ada Byron (Lady Lovelace) and Herman Hollerith. Ada worked with Charles Babbage and came up with programming concepts used even today. Hollerith used punched cards for input on his machine that was used for the 1890 census. It was his machine that got IBM into the computer business. Before that they made typewriters.
Hopper did not find the bug. Some techs working on the machine did. Also, remember that the term "bug" had been in use for over sixty years before the Mark I bug. Pretty please correct the article text; this erroneous story has been circulating for decades.

See http://en.wikipedia.org/wiki/Software_bug.

As noted by another commenter, Hollerith's work was also very important; IBM purchased his company, and punched card unit-record tabulating equipment evolved into the basis of modern computer industry hardware. Sales of these machines to the Germans in the 1930s also helped them track the people they wanted to round up and kill.
This is a great article. What amazes me is the Antikythera Mechanism. If they had this technology 100 to 150 years BC, then what the heck happened in the meantime? Did we all just blank out for 1000 years? Weird.
The Curta (http://en.wikipedia.org/wiki/Curta) was a hand-held mechanical calculator; it's still prized today as a collector's piece, but very expensive (http://www.vintagecalculators.com).
Don't forget numerous slide rules and other mechanical devices. There was also a plethora of card sorting machines around the late 19th to early 20th century (such as the ones IBM famously sold to the Nazis). Ada Lovelace's contributions to computing (being the first programmer) were recognized by the DoD as well: systems were standardized on the "Ada" computer language for quite a while.

The Computer History Museum (http://www.computerhistory.org/) has a great open gallery that has a load of historical computers available, and is definitely worth a visit if you're in the area. It's even going to get one of those Difference Engines for display, albeit last I checked it was late b/c of construction issues.
Good job on the original article....but, it'd be nice if you had a "stories section" for the full length article and put a short blurb on the front page just like any other article. You'll still get your SEO (assuming you put your stories section under your main domain) and be able to digg it, it just won't take up so much scrolling space on the front page.
Thanks for the great post. Although I had seen much of this information in other contexts, it is cool to have a chance to skip through the centuries of computation all in one convenient post.

Well done!
Your article isn't actually accurate - the first electronic digital computer was NOT the ENIAC but rather the ABC (Atanasoff-Berry Computer) in 1942 at Iowa State. (see below). Atanasoff never had any vision to use it, and ISU didn't either and it sat in a basement for years.

1942
The Atanasoff-Berry Computer is completed. Built at Iowa State College (now University), the Atanasoff-Berry Computer (ABC) was designed and built by Professor John Vincent Atanasoff and graduate student Cliff Berry between 1939 and 1942. While the ABC was never fully-functional, it won a patent dispute relating to the invention of the computer when Atanasoff proved that ENIAC co-designer John Mauchly had come to see the ABC shortly after it was completed
Hi, you left out the Chinese clock which also calculated the positions of the stars, sun and moon. It was a very large and advanced mechanical piece.
@Ashley: you're right, Neatorama's not at all low bandwidth/slow computer-friendly, but I'm not sure separating this post out of the homepage would help that much ...

@VonSkippy: I'll think about it, but at the moment, it's a little more work to set up a separate "story section" than I'd have time to put in. I've never done a lick of SEO on Neatorama (unless you count "prettifying" the permalink) so that's not even a consideration. Just scroll down, people!

Thanks for those who liked the article - it was fun to write (found a lot of neat trivia that sadly didn't make it to the article).
This is a great post! A second installment exploring the explosive growth of personal computers would be equally interesting. Bill Gates is a man we love to hate, but if not for his vision, as well as that of Steve Jobs, et al, we wouldn't be communicating the way we are in this blog.
Excellent piece! Thank you. I was a little surprised to find no mention of John von Neumann nor of his paper giving the first design of a stored-program digital electronic computer. (Eckert and Mauchly's ENIAC was digital and electronic, but not stored-program. Say what you like about software today, but at least it does not involve patch bays). See: "First Draft of a Report on the EDVAC," John von Neumann, Contract No. W-670 ORD-4926. Moore School of Engineering and University of Pennsylvania. June 30, 1945. http://www.virtualtravelog.net/entries/2003-08-TheFirstDraft.pdf
I'd be interested in a reference or your assertion that Kepler used the Schickard calculator.

According to the accounts I have seen he had ordered a second one built intending to give it to Kepler but it was destroyed by fire before its completion and no further mention is made in the papers so far discovered.
What happened to computing advancements between Antikythera Mechanism (100 BC) and Napier’s Bones (1614)?

Only about 1700 years are 'missing'. Did you omit them fearing you may run out of 'paper'?
An interesting point is the business models of the early computer makers: Univac (Sperry) vs. IBM.
IBM became the computer monolith, or behemoth.
Sperry faded away - eventually becoming Univac (the power of two - an early corporate merger model of what would not work out).
IBM even had obsolete technology - the Hollerith card punch machine - while Sperry had a real computer.
Yet the IBM people were businessmen: support, financing, sales support.
Whereas the Sperry side were scientists and technicians ("If you think this is great, wait till next year" - no sales then).
The message is one of properly balanced management - of both research and sales.
Following up on 8 and 12: IBM introduced the standard punch card in, I think, 1923. A definite influence on thousands of computer scientists and millions of dead Jews.
One important fact was left out. Since the Jacquard Loom used punch cards to determine the pattern, it was the first programmable computer (albeit a mechanical one). Excellent article.
In their history of the ENIAC computer, Alice R. Burks and Arthur W. Burks summarize the Atanasoff achievement as follows: "He invented a new type of a serial storage module, applicable to digital electronic computing. He formulated, developed and proved the major principles involved in electronic circuits for digital computing, principles that included arithmetical operations, control, transition from one to another number base systems, transfer and storage of data, and synchronized clocking of the operations. Having applied that data storage and those principles, he constructed a well-balanced electronic computer with centralized architecture, including storage, and arithmetically controlled input/output devices. He had invented the first-ever specialized electronic computer with such a degree of multi-aspect applicability."

So the first computer was made by the son of a Bulgarian immigrant - John Atanasoff!

Look at:

http://en.wikipedia.org/wiki/John_Vincent_Atanasoff

http://www.johnatanasoff.com/the_prototype.php?sub=history
1943 was hardly the EVE of world war 2 by the way.
Nice article, thanks!

"
The birth of the world’s first electronic digital computer was ushered … by war. In 1943, on the eve of World War II, the US military realized
"
A couple of commenters ask why there were no known advancements in computing technology for a thousand years after the Antikythera Mechanism.

During the intervening period, the Roman Empire collapsed and much of the fruits of European civilisation were lost in a period called the Dark Ages. It was only after hundreds of years that technology resumed its march.

Though there were some developments in other parts of the world, they didn't catch on and change history the way that the Renaissance and Industrial Revolution did.
Actually, our fingers could be considered the oldest known mathematical artifact.

And imagine how much easier life would be if we had 8 fingers instead of 10, and therefore a numerical system that was base-8 instead of base-10!
Great post! I was at school when there were no handheld electronic calculators. We actually did use logarithmic tables! We also used slide rules ( http://en.wikipedia.org/wiki/Slide_rule ). I remember my father getting his company technicians to repair my slide rule's cursor when I broke it. I also personally had one of these: http://www.computermuseum.li/Testpage/Calc-Chadwick.htm . I'd completely forgotten that until I read this article!
The Germans were using Enigma to secure banking communications well before the war; it is because of this that the Poles had a chance to understand the machine. Their passing this information to the Allies may have been their greatest contribution to the war effort, as much as it doesn't make headlines in the history books.
Hi,

I found this article really interesting.
It makes us think about how far we have evolved in terms of technology, and also about whether we know where and how to apply it.

Best regards,

José
The photo on your website that you have labeled "Turing and Welchman's Bombe (Photo: La bombe de Turing)" and named bombe-turing.jpg is actually a photo of a Navy WAVE and a US Navy Bombe designed and constructed by Joseph Desch et al. in Dayton, Ohio on the National Cash Register (NCR) industrial campus in Building 26. This would be the Bombe that was used to decode the 4-wheeled Enigma. For the matter of accuracy - you may want to relabel and rename the photo.
A second on Paul Farrier's comments, this is the classic stock photo of the Navy Bombe developed at NCR - also, in the photograph of the rebuilding of Colossus below it, Tony Sale is the gent on the right, not the left.
Hey, it's beautiful! Because now I know a lot about early devices, and who invented them. Hahaha!
Nice page. I guess every reader's got what they consider to be a critical omission; mine is the Curta hand-cranked mechanical calculator, which would seem to be the technological culmination of that line of development - astonishingly devised while its inventor, Curt Herzstark, was in a Nazi concentration camp.