People understood hot and cold long before we had temperature, a standardized way to measure how hot or cold something is. While other scientists had tinkered with the idea of measuring temperature, it was not until 1714 that Daniel Gabriel Fahrenheit gave us a practical and fairly accurate thermometer.
Fahrenheit based his invention on Danish scientist Ole Roemer's alcohol-based thermometer. Roemer set zero on his scale at the temperature where brine (salt water) froze and 60 at the point where water boiled, wrote Ulrich Grigull, the late director of the Institute for Thermodynamics at the Technical University of Munich in Germany, in a 1986 conference presentation. Ice melted at 7.5 degrees on the Roemer scale, and a human body registered at 22.5.
Fahrenheit's thermometer, though, was much more accurate. He used the same freezing and boiling reference points as Roemer's scale — referred to in his writings as "Extream Cold" and "Extream Hott" — but roughly multiplied Roemer's values by four, dividing the scale into finer increments. On Fahrenheit's scale, wrote Grigull, the four reference points were: 0 (the freezing temperature of brine), 30 (the freezing point of regular water), 90 (body temperature) and 240 (the boiling point of water).
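The "multiply by four" relationship between the two scales can be checked against the reference points above. A minimal sketch (the function name is ours, and the simple ×4 rule is only the rough correspondence described here, not Fahrenheit's exact calibration procedure):

```python
def roemer_to_early_fahrenheit(deg_roemer):
    """Roughly convert a Roemer reading to Fahrenheit's original
    (pre-recalibration) scale by multiplying by four."""
    return deg_roemer * 4

# Reference points quoted above: (Roemer, early Fahrenheit)
reference_points = {0: 0, 7.5: 30, 22.5: 90, 60: 240}

for roemer, fahrenheit in reference_points.items():
    assert roemer_to_early_fahrenheit(roemer) == fahrenheit
```

All four pairs line up exactly under the ×4 rule, which is why Grigull describes Fahrenheit's early scale as a finer subdivision of Roemer's.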
These points were recalibrated after Fahrenheit's death. But there was still room for improvement: Anders Celsius later devised a more math-friendly scale, and Lord Kelvin created another that extended temperature measurement for scientific use. An article at LiveScience explains how the concept of temperature evolved and how we got our scales. Which scale is best still depends on where you are and exactly what you are measuring. -via Digg
(Image credit: Flickr user barbbarbbarb)