The B-17 Flying Fortress was the workhorse of World War II. Boeing designed and produced it in just 12 months for the Air Force, building a bomber that could strike the Axis powers and survive its missions. But there were crashes, and those crashes were blamed on pilot error, particularly since so many of the pilots were new, recruited and trained for the war. Psychologist Paul Fitts looked at the data and saw that something was very wrong.
The examples ranged from the tragic to the tragicomic: pilots who slammed their planes into the ground after misreading a dial; pilots who fell from the sky never knowing which direction was up; pilots of B-17s who came in for smooth landings and yet somehow never deployed their landing gear. And others still got trapped in a maze of absurdity, like the one who, having jumped into a brand-new plane during a bombing raid by the Japanese, found the instruments completely rearranged. Sweaty with stress, unable to think of anything else to do, he simply ran the plane up and down the runway until the attack ended.
Fitts' data showed that during one 22-month period of the war, the Air Force reported an astounding 457 crashes just like those smooth landings in which a pilot set a B-17 down without ever deploying the landing gear, thinking everything was fine. The culprit was maddeningly obvious for anyone with the patience to look. Fitts' colleague Alphonse Chapanis did the looking. When he started investigating the airplanes themselves, talking to people about them, sitting in the cockpits, he didn't see evidence of poor training. He saw, instead, the impossibility of flying these planes at all. Instead of "pilot error," he saw what he called, for the first time, "designer error."
Fitts came up with a way to make the B-17s, and all planes built afterward, much safer by taking human behavior into account. His research led, in time, to the concept of "user friendliness." Read how that concept grew to make all of our lives easier, and how it can take us to unpleasant extremes, at Wired. -via Damn Interesting