Starbulletin.com

Facts of the Matter

Richard Brill


Temperature scales took
centuries of fine-tuning


Is it cold or is it me? Measurements of that unspecified "it" are so commonplace today that it's easy to forget that through most of history, people had no way of knowing the temperature of anything.


ASSOCIATED PRESS
There are 180 degrees between the freezing and boiling points of water on the Fahrenheit scale. Celsius uses 100 degrees.


Temperature is abstract to begin with, depending on concepts of "hot" and "cold" that are based purely on subjective sensory impressions. For example, the ocean will feel cold when you first jump in, warm after you have been in for a while, then downright tepid when you jump back in after a few seconds in the wind.

Before the 19th century, heat and temperature were not clearly distinguished, and thermometers were thought to measure heat.

Likewise hot and cold were thought to be independent but interacting properties, a kind of thermal yin-yang. Thermometry helped to show that they were distinct aspects of the same thing and led to our modern kinetic model of heat.

Today we distinguish temperature and heat as separate but related concepts, heat being a form of molecular energy and temperature a measure of the energy of molecules. This kinetic model visualizes molecules in constant motion, having faster motion and more energy at higher temperature, and transferring heat by collisions between molecules.

To learn to distinguish between heat and temperature, it was first necessary to measure temperature accurately and reliably. This is known as thermometry, literally "measurement of heat," a name that preserves the historical conflation of the two concepts.

The thermometer (measurer of heat) is a device that measures a consistent change in some property as temperature changes. The first thermometers measured the expansion of a gas, but today's thermometers measure change in size, electrical resistance or emission of infrared radiation.

One of the first attempts to measure temperature and relate it to a standard temperature scale was in 170 A.D., when Galen, the seminal Roman physician, proposed a standard "neutral" temperature made by mixing equal quantities of boiling water and ice with "degrees" of hot and cold on either side. He didn't suggest any way to relate these degrees to anything.

Little changed until the late 16th century, when there arose curiosity and puzzlement throughout Europe about the expansion of air upon heating. No one could figure out how to measure the effect or use it to measure temperature. Because there were so many individuals working with the concept, there is quite a bit of conflicting information today about who invented what and when. But a few instances stand out.

The first devices to take advantage of air's expansion and contraction to indicate temperature were called thermoscopes, invented in numerous variations by different inventors, especially flourishing in what is now Italy. These instruments came to be called "Florentine thermoscopes," numerous versions of which became popular as parlor amusements. Galileo Galilei invented one in 1593 that displayed temperature variations but could not relate a number to a temperature. In 1610 he filled one with wine, thereby inventing the first alcohol thermoscope.

Despite the thermoscope fad, it would be 150 years before there was a thermometer that could reliably assign a number to "the temperature." Some Florentine thermoscopes had scales to mark certain temperatures, but they were unique to a particular location and no two were alike. In most cases the low point on the scale marked the coldest day that year, and the high point marked the hottest day.

As nifty as they were, all Florentine thermoscopes were based on expansion and contraction of the air in a tube. That meant that they also registered changes in atmospheric pressure and were not very accurate. The first thermometer that was sealed from atmospheric pressure was designed in 1641 for the grand duke of Tuscany. It used alcohol, which freezes at a lower temperature than water, and it had degree marks but no numbers. It still used a pair of past local temperature extremes as reference points.

What was needed was the concept that certain things always represent the same "temperature."

The man who first thought of using the freezing point of water as the "zero" or starting point was the English scientist Robert Hooke, in 1664. Hooke was a first-class thinker who also pioneered the scientific use of the microscope, served as the Royal Society of London's first curator of experiments, discovered a fundamental law of elasticity and planted the seeds of gravitation in Isaac Newton's mind, among other things.

On Hooke's scale, one degree represented a change of volume equivalent to about 1/500 (0.2 percent) of the volume of the thermometer liquid. This scale needed only one fixed point, and Hooke selected the freezing point of water. By scaling it this way, Hooke spawned the concept that a standard scale could be established for thermometers of a variety of sizes, and the idea that some things, such as freezing water, always happen at the same temperature.
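Hooke's single-fixed-point idea can be sketched in a few lines of Python. This is a hypothetical illustration, not his actual procedure; the function name and the volumes are invented for the example:

```python
def hooke_degrees(volume, volume_at_freezing):
    """One Hooke degree = an expansion of about 1/500 (0.2 percent) of the
    liquid's volume at the freezing point of water, the scale's single
    fixed point."""
    one_degree = volume_at_freezing / 500  # volume change per degree
    return (volume - volume_at_freezing) / one_degree

# A liquid that occupies 100.0 units at freezing and expands by 0.2 percent
# has risen one Hooke degree:
print(round(hooke_degrees(100.2, 100.0), 6))  # 1.0
```

Because every thermometer's degree is defined as a fixed fraction of its own liquid volume, instruments of different sizes can agree, which is the point Hooke's scheme established.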

Hearing of Hooke's work, a Danish astronomer named Roemer (who also made the first accurate measurement of the speed of light) introduced a slight variation in the concept in 1702. He decided to use two fixed points instead of one and chose melting ice and boiling water as the reference points. This avoided any inaccuracies that might be expected in measuring small volume changes as in Hooke's thermometer.

The next step was taken around 1720 by Daniel Gabriel Fahrenheit, an instrument maker from Poland who worked in the Netherlands. Fahrenheit wanted thermometers to be precision instruments that gave reproducible results. He realized that the secret wasn't the range of temperature on a particular day or place, but finding the right materials for the thermometer and the right conditions for standard reference temperatures.

For seven years he worked on perfecting a temperature scale and a precision thermometer that used mercury as the thermometric liquid. His research showed that mercury has very nearly ideal properties: It conducts heat quickly; its thermal expansion is large and uniform; it does not adhere to glass; it remains a liquid over a wide range of temperatures; and its silvery appearance makes it easy to read.

As brilliant as his use of mercury as a working fluid was, Fahrenheit didn't see the utility of using the freezing and boiling points of water as reference points. He was more concerned with avoiding negative numbers, which were even more confusing to most people in his time than they are now. Even then the primary use of thermometers was for weather, and it often falls below freezing in the winter in northern Europe. Of course he could have used Roemer's reference points and simply shifted the scale downward, but he didn't have the benefit of our current understanding of such things.

Fahrenheit calibrated the scale of his mercury thermometer by marking the level of the mercury in a mixture of salt, ice and water, denoting this as zero. He denoted a second point on the scale, the freezing of water without salt, as "30." He denoted a third point, human body temperature, as "96."

The choice of these numbers may seem odd, but Fahrenheit's system was as ingenious as it was unique. He originally used a 12-point scale with zero, four and 12 for the three reference points. Then he divided each mark into eight. So "4" became "32 degrees," correcting his original "30" as the ice melting temperature; "12" became "96 degrees," body temperature that we now correct to 98.6 degrees Fahrenheit. On this scale water boils at 212 degrees, and there are 180 degrees between freezing and boiling.
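The arithmetic behind Fahrenheit's rescaling, as described above, can be verified in a few lines of Python (the labels are illustrative):

```python
# Fahrenheit's original 12-point scale: three reference points at 0, 4 and 12.
reference_points = {
    "salt/ice/water mixture": 0,
    "freezing water": 4,
    "body temperature": 12,
}

# Dividing each original mark into eight finer degrees:
fine_scale = {name: mark * 8 for name, mark in reference_points.items()}
print(fine_scale["freezing water"])    # 4 * 8 = 32
print(fine_scale["body temperature"])  # 12 * 8 = 96

# Water boils at 212 degrees on this scale, so the span from freezing
# to boiling is:
print(212 - fine_scale["freezing water"])  # 180
```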

Although there were 30 or so competing systems of temperature gradation in Europe at the time, Fahrenheit's scale became enormously popular because of the precision and quality of his thermometers.

In 1745, Carolus Linnaeus, a Swedish scientist who is better known for the system of biological taxonomy that bears his name, suggested a scale that used 100 gradations between the freezing and boiling of water. Anders Celsius, a Swedish astronomer, had actually done it in 1742, but with a reversed scale on which water boiled at 0 degrees and froze at 100 degrees. The Celsius scale was subsequently reversed and was simply called the "centigrade" scale until it was officially renamed "Celsius" in 1948 in his honor. It is the standard temperature scale for science and for most of the world -- the United States still uses Fahrenheit, and England, the only other holdout, now uses the Celsius scale for most purposes.
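Because the two scales place 180 and 100 degrees, respectively, between the same two fixed points of water, converting between them is a simple linear map. A minimal sketch in Python (function names are our own):

```python
def f_to_c(f):
    """Fahrenheit to Celsius: both scales are linear between water's
    freezing point (32 F / 0 C) and boiling point (212 F / 100 C)."""
    return (f - 32) * 100 / 180

def c_to_f(c):
    """Celsius to Fahrenheit: the inverse of the map above."""
    return c * 180 / 100 + 32

print(f_to_c(212))               # 100.0  (boiling water)
print(c_to_f(0))                 # 32.0   (freezing water)
print(round(f_to_c(98.6), 1))    # 37.0   (body temperature)
```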

The centigrade scale was named for Celsius because he was the first to perform and publish careful experiments aimed at establishing an international temperature scale on scientific grounds. He performed carefully controlled experiments to check that the freezing point is independent of latitude and atmospheric pressure. He also measured the dependence of the boiling point of water on atmospheric pressure (in excellent agreement with modern data) and derived a rule for determining the boiling point at pressures other than the standard 1 atmosphere.

We can now measure temperature down to fractions of microdegrees, but when it comes right down to it, when you're too cold or too hot, it really doesn't matter what the thermometer says.




Richard Brill picks up where your high school science teacher left off. He is a professor of science at Honolulu Community College, where he teaches earth and physical science and investigates life and the universe. He can be contacted by e-mail at rickb@hcc.hawaii.edu

© 2003 Honolulu Star-Bulletin -- https://archives.starbulletin.com

