Measuring Units In Folklore And History
In the biblical story of Noah, the ark was supposed to be 300 cubits long and 30 cubits high. Like all early units of size, the cubit was based on the always-handy human body, and was most likely the length of a man's forearm from elbow to fingertip. You could measure a board, for example, by laying your forearm down successively along its length. In the Middle Ages, the inch is reputed to have been the length of a king's first thumb joint. The yard was once defined as the distance between the nose of England's King Henry I and the tip of his outstretched middle finger. The origin of the foot as a unit of measurement is obvious.
In Renaissance Italy, Leonardo da Vinci used what he called a braccio, or arm, in laying out his works. It was equal to two palmi, or palms. But arms and palms, of course, will differ. In Florence, the engineers used a braccio that was 23 inches long, while the surveyors' braccio averaged only 21.7 inches. The foot, or piede, was about 17 inches in Milan, but only about 12 inches in Rome.
Eventually, ancient "rules of thumb" gave way to more carefully defined units. The metric system was adopted in France in 1799 and the British Imperial System of units was established in 1824. In 1893, the English units used in the United States were redefined in terms of their metric equivalents: the yard was defined as 0.9144 meter, and so on. But English units continue to be used in the United States to this day, even though the Omnibus Trade and Competitiveness Act of 1988 stated that "it is the declared policy of the United States...to designate the metric system of measurement as the preferred system of weights and measures for United States trade and commerce."
English units are based on inconsistent standards. When that medieval king's thumb became regrettably unavailable for further consultation, the standard for the inch was changed to the length of three grains of barley, placed end to end—not much of an improvement. Metric units, on the other hand, are based on defined and controlled standards, not on the whims of humans.
The standards behind the English units are not reproducible. Arms, hands, and grains of barley will obviously vary in size; the size of a 3-foot yard depends on whose feet are in question. But metric units are based on standards that are precisely reproducible, time after time.
There are many English units, including buckets, butts, chains, cords, drams, ells, fathoms, firkins, gills, grains, hands, knots, leagues, three different kinds of miles, four kinds of ounces, and five kinds of tons, to name just a few. There are literally hundreds more. For measuring volume or bulk alone, the English system uses ounces, pints, quarts, gallons, barrels and bushels, among many others. In the metric system, on the other hand, there is only one basic unit for each type of quantity.
Any measuring unit, in whatever system, will be too big for some applications and too small for others. People would not appreciate having their waist measurements in miles or their weights in tons. That's why we have inches and pounds. The problem, though, is that in the American system the conversion factors between various-sized units—12 inches per foot, 3 feet per yard, 1,760 yards per mile—are completely arbitrary. Metric units, on the other hand, have conversion factors that are all powers of ten. That is, the metric system is a decimal system, just like dollars and cents. In fact, the entire system of numbers is decimal, based on tens, not threes or twelves. Therefore, converting a unit from one size to another in the metric system is just a matter of moving the decimal point.
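The contrast described above can be sketched in a few lines of code. The conversion factors below are the standard ones; the function names themselves are illustrative, not from any particular library.

```python
# English length conversions chain together arbitrary factors.
INCHES_PER_FOOT = 12
FEET_PER_YARD = 3
YARDS_PER_MILE = 1760

def miles_to_inches(miles):
    """Convert miles to inches by multiplying out the arbitrary English factors."""
    return miles * YARDS_PER_MILE * FEET_PER_YARD * INCHES_PER_FOOT

# Metric conversions are all powers of ten: converting kilometers to
# centimeters just shifts the decimal point five places.
def kilometers_to_centimeters(km):
    """Convert kilometers to centimeters by multiplying by 10^5."""
    return km * 10**5

print(miles_to_inches(1))            # 63360
print(kilometers_to_centimeters(1))  # 100000
```

Note that the metric conversion needs no memorized factor at all, only a count of decimal places, which is the practical advantage the text describes.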