Tuesday, April 29, 2014


I'm in an odd mood this morning, so today's tirade will be on an equally odd topic for Liberty's Torch: units of measure, specifically those for weight, volume, and length.

Let's start with an old one: the cubit. Many a schoolchild, upon his first encounter with the story of Noah's Ark, asks about this unusual unit of measure. The usual explanation the Sunday school teacher offered, at least back in those darkly remembered days of mine own, was "it's about nineteen inches." In fact, a cubit is a nonspecific unit: it refers to the distance between a man's elbow and the tip of his middle finger. Thus, an individual carpenter could measure things in his own personal cubit, but he'd have trouble working with another carpenter with longer or shorter arms...at least, if precision were of any great importance to either of them.

A unit more familiar to us, the foot, began life in an equally nonspecific fashion: it referred to the length of an ordinary man's foot, a unit of moderate importance to the Roman legions. (Yes, they had quartermasters, too.) It became important enough to "nail down" as a fixed length, independent of any particular person's foot, only after many bunions and blisters. Similarly, the inch was the colloquial term for the length of the outermost phalange of an ordinary man's thumb. The use of that digit for rough measure in inches is still practiced today, though perhaps not by pianists or basketball players.

Passing now to units of weight and volume, we first meet the pint at the markets along the English seacoast and major rivers. As waterborne trade proliferated and older, less specific units such as the bushel and peck ceased to be adequate, there arose a need for a widely agreed-upon standard in which to trade units of volume. This applied with special force to that quintessential trade good, ale, which for some years functioned as a kind of money among the lower classes of the British Isles. However, ales varied in quality then as now, so the first standard proposed, "enough to get a teenaged virgin drunk," was deemed unacceptable. Eventually, the pint was agreed to be roughly a cube three inches on a side; the modern American pint, at 28.875 cubic inches (one-eighth of the 231-cubic-inch gallon), runs a bit larger than that tidy 27, but the rough standard has endured to this day.

Units of weight followed from units of volume. The pound, the most important unit of dry measure, was defined as the weight of one pint of seawater. This excellent definition proceeded from the near-universal access to seawater of English markets. The uniformity of seawater ensured that any differences among "standard pounds" used in markets island-wide would be small enough to be tolerable. As balance scales were the standard weight-measuring implements of those days, the pound was swiftly divided into sixteen ounces, facilitating the use of the most convenient fractions of a pound: the reciprocal powers of two.

Note the respect the above units grant to common fractions in weight, volume, and distance measurement. Weight and volume facilitate binary division down to two to the minus-fourth power. Distance measurement's fundamental unit, the foot of twelve inches, facilitates halves, thirds, quarters, sixths, and (of course) twelfths. This is no accident: the persons who were first to use those measurements wanted the ability to easily create those fractional measures. Trade in small quantities demanded it.
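The arithmetic behind that observation is easy to check. A minimal sketch (the variable names are mine, not the author's):

```python
# Halving a 16-ounce pound four times always yields a whole number of ounces:
ounces = 16
halvings = []
while ounces > 1 and ounces % 2 == 0:
    ounces //= 2
    halvings.append(ounces)
print(halvings)  # [8, 4, 2, 1] -- halves, quarters, eighths, sixteenths of a pound

# A 12-inch foot divides evenly into all the trade-friendly fractions:
divisors = [n for n in (2, 3, 4, 6, 12) if 12 % n == 0]
print(divisors)  # [2, 3, 4, 6, 12]
```

Sixteen being two to the fourth power is exactly what makes the balance-scale halving trick bottom out at a single ounce.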

The aggregation of small units into larger numbers sometimes followed intuitive rules. Eight pints were stacked into a larger cube -- two by two by two -- to create the gallon, one-quarter of which became the quart. Three feet became a yard, the most important clothier's unit. Far less intuitively, fourteen (!) pounds became a stone, thirty-two pounds became a slug, two thousand pounds became the American short ton (the British long ton is 2,240), 220 yards became a furlong, and 1760 yards became the standard mile. These "Imperial system" units proliferated worldwide through the ever-expanding scope of British trade.
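Those aggregations can be laid out as a quick sketch (the constant names are my own, not anything from the text):

```python
# Imperial aggregations mentioned above, each expressed in its base unit.
PINTS_PER_GALLON = 8                # a 2 x 2 x 2 stack of pints
PINTS_PER_QUART = PINTS_PER_GALLON // 4
FEET_PER_YARD = 3
YARDS_PER_FURLONG = 220
YARDS_PER_MILE = 1760
POUNDS_PER_STONE = 14
POUNDS_PER_TON = 2000               # the American short ton

# The furlong divides the mile neatly, another nod to easy fractions:
furlongs_per_mile = YARDS_PER_MILE // YARDS_PER_FURLONG
print(furlongs_per_mile)  # 8
print(PINTS_PER_QUART)    # 2
```

Note that the mile works out to a whole number of furlongs, which is why horse racing still measures distances that way.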

As positional (place-value) numeration eclipsed Roman numerals and the metrical sciences advanced, it became ever clearer that computation in large or very small measures would benefit from a decimal system of units. The horror of the French Revolution gave birth to the first such, the metric system, in which each unit was related to others of its kind by some power of ten. The metric units were first standardized in those years, but have been gradually redefined to take account of advances in the understanding of light waves and fundamental particles. A summary of the current "international" metric standard can be found here.

Quite a lot of derision has been spilled upon the United States for "not going metric." This is understandable from both perspectives. Americans, with our large, densely interconnected domestic markets, are reluctant to convert from our familiar units to something to which we have no experiential or emotional attachment. Metric-system countries find it inconvenient to make use of the many thousands of American products made according to our Imperial-based unit scheme.

Fortunately, some years ago the federal government backed away from its attempt to impose the metric system on Americans and our institutions by law. There might not be a revolution over ObamaCare, but you can bet your bottom dollar that we'd get out the torches and pitchforks over having to relabel every product in our stores, reshoot every commercial that ever appears on television, and redo all the signs along our millions upon millions of miles of roads, merely to "metrify" them. And what of the publishing industry? Millions of books already in print are lousy with Imperial units. Consider especially the horror that would be "metrified porn:" "Deftly he slid his twenty-five-centimeter joystick into her welcoming love tunnel, buried his face in her velvety hundred-centimeter bosom, and began to newton away." Unthinkable!

No, we're better off retaining both systems, one for science and technology, the other for common and customary uses. At least for the present, until speeders can get used to hearing cops accuse them of going 145.2 in an 88.7 zone, and grocers can accustom themselves to being asked for 226.8 grams of fresh mozzarella.
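For the curious, those closing figures hold up under the standard conversion factors; a small sketch (the function name is mine):

```python
KM_PER_MILE = 1.609344          # exact by definition since 1959
GRAMS_PER_OUNCE = 28.349523125  # exact: 453.59237 g per avoirdupois pound, divided by 16

def kmh_to_mph(kmh):
    """Convert kilometers per hour to miles per hour."""
    return kmh / KM_PER_MILE

# 145.2 km/h in an 88.7 km/h zone is our familiar 90 mph in a 55 zone:
print(round(kmh_to_mph(145.2), 1))  # 90.2
print(round(kmh_to_mph(88.7), 1))   # 55.1

# And half a pound of mozzarella really is about 226.8 grams:
print(round(8 * GRAMS_PER_OUNCE, 1))  # 226.8
```

The awkward decimals are the whole point: round numbers in one system rarely stay round in the other.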


FrankC said...

Now please explain why an American pint is smaller than a British one.

Erbo said...

That's not to say that we don't use the metric system in places where it makes sense to do so. The 2-liter soda bottle, for instance, has proven to be a convenient form of packaging.

Joseph said...

Why should a decimal system be considered better? In the Computer Age, shouldn't we use hexadecimal? Let's see... There are 0x10 ounces in a pound or pints in a peck...

Roy said...

One interesting fact about all of this is that after the American Revolution, the USA rejected the British monetary system of pound sterling and settled on a decimal monetary system based on the dollar that we have today - ten pennies to a dime, ten dimes to a dollar etc. Though we still retain the quarter and half-dollar, we no longer have shillings etc.

Ironically, had the French revolution happened *before* the American one, we might have become an early leader in adopting the metric system 200 years ago.

One last point I will make is that I work in the tech industry, and I have no patience with people who denigrate the US because we haven't "embraced" the metric system. The fact is, we have. It's just that we have refused to simultaneously reject the tried and true SAE system. We actually do both and do them both very well. While we still use inches and feet, and quarts and gallons for our day to day measurements, almost all machinery and electronics nowadays - even American made - use the metric system.