Measurement Brain Teaser - What constant is the basis for the meter?

A

Al Dyer

#1
We know that at the time, the constant for a yard was the length from the King of England's nose to the tip of his fingers.

What constant is the basis for the meter?

I will give the winner a copy of ISO 9001:2000
________________________________________:bigwave:
 
J
#2
According to Encarta:

The meter (m), which is approximately 39.37 inches, was originally defined as one ten-millionth of the distance from the equator to the North Pole on a line running through Paris.
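
A quick sketch of that arithmetic in Python, assuming a nominal 10,000 km pole-to-equator quadrant (a round number for illustration, not a surveyed value):

# One ten-millionth of an assumed 10,000 km quadrant
quadrant_m = 10_000 * 1000          # pole-to-equator distance in meters (nominal)
meter = quadrant_m / 10_000_000     # one ten-millionth of the quadrant
print(meter)                        # 1.0 m
print(meter / 0.0254)               # ~39.37 inches (1 inch = 25.4 mm exactly)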


James
:bigwave:

PS: Can I trade the 9K2K for a nice zinfandel?
 

Jerry Eldred

Forum Moderator
Super Moderator
#3
Guess I'd better chime in...

THE ENGLISH SYSTEM
The measurement system commonly used in the United States today is nearly the same as that brought by the colonists from England. These measures had their origins in a variety of cultures - Babylonian, Egyptian, Roman, Anglo-Saxon, and Norman-French. The ancient "digit," "palm," "span," and "cubit" units evolved into the "inch," "foot," and "yard" through a complicated transformation not yet fully understood.

Roman contributions include the use of the number 12 as a base (our foot is divided into 12 inches) and words from which we derive many of our present measurement unit names. For example, the 12 divisions of the Roman "pes," or foot, were called unciae. Our words "inch" and "ounce" are both derived from that Latin word.

The "yard" as a measure of length can be traced back to the early Saxon kings. They wore a sash or girdle around the waist that could be removed and used as a convenient measuring device. Thus the word "yard" comes from the Saxon word "gird" meaning the circumference of a person's waist.

Standardization of the various units and their combinations into a loosely related system of measurement units sometimes occurred in fascinating ways. Tradition holds that King Henry I decreed that the yard should be the distance from the tip of his nose to the end of his thumb. The length of a furlong (or furrow-long) was established by early Tudor rulers as 220 yards. This led Queen Elizabeth I to declare, in the 16th century, that henceforth the traditional Roman mile of 5,000 feet would be replaced by one of 5,280 feet, making the mile exactly 8 furlongs and providing a convenient relationship between two previously ill-related measures.

Thus, through royal edicts, England by the 18th century had achieved a greater degree of standardization than the continental countries. The English units were well suited to commerce and trade because they had been developed and refined to meet commercial needs. Through colonization and dominance of world commerce during the 17th, 18th, and 19th centuries, the English system of measurement units was spread to and established in many parts of the world, including the American colonies.

However, standards still differed to an extent undesirable for commerce among the 13 colonies. The need for greater uniformity led to clauses in the Articles of Confederation (ratified by the original colonies in 1781) and the Constitution of the United States (ratified in 1790) giving power to the Congress to fix uniform standards for weights and measures. Today, standards supplied to all the States by the National Institute of Standards and Technology assure uniformity throughout the country.

THE METRIC SYSTEM
The need for a single worldwide coordinated measurement system was recognized over 300 years ago. Gabriel Mouton, Vicar of St. Paul in Lyons, proposed in 1670 a comprehensive decimal measurement system based on the length of one minute of arc of a great circle of the earth. In 1671, Jean Picard, a French astronomer, proposed the length of a pendulum beating seconds as the unit of length. (Such a pendulum would have been fairly easily reproducible, thus facilitating the widespread distribution of uniform standards.) Other proposals were made but over a century elapsed before any action was taken.

In 1790 in the midst of the French Revolution, the National Assembly of France requested the French Academy of Sciences to "deduce an invariable standard for all the measures and all the weights." The Commission appointed by the Academy created a system that was, at once, simple and scientific. The unit of length was to be a portion of the earth's circumference. Measures for capacity (volume) and mass were to be derived from the unit of length, thus relating the basic units of the system to each other and to nature. Furthermore, the larger and smaller versions of each unit were to be created by multiplying or dividing the basic units by 10 and its powers. This feature provided a great convenience to users of the system, by eliminating the need for such calculations as dividing by 16 (to convert ounces to pounds) or by 12 (to convert inches to feet). Similar calculations in the metric system could be performed simply by shifting the decimal point. Thus the metric system is a "base-10" or "decimal" system.
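
A small Python illustration of that convenience, using arbitrary example quantities:

inches = 93
print(inches / 12)          # English units: divide by 12 -> 7.75 feet

ounces = 93
print(ounces / 16)          # English units: divide by 16 -> 5.8125 pounds

millimeters = 93
print(millimeters / 1000)   # metric: just shift the decimal point -> 0.093 meters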

The Commission assigned the name "metre" (meter) to the unit of length. This name was derived from the Greek word metron, meaning "a measure." The physical standard representing the meter was to be constructed so that it would equal one ten-millionth of the distance from the north pole to the equator along the meridian of the earth running near Dunkirk in France and Barcelona in Spain.

The metric unit of mass, called the "gram," was defined as the mass of one cubic centimeter (a cube that is 1/100 of a meter on each side) of water at its temperature of maximum density. The cubic decimeter (a cube 1/10 of a meter on each side) was chosen as the unit of fluid capacity. This measure was given the name "liter."
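
A brief sketch of how those original definitions tie length, volume, and mass together, assuming water at maximum density is taken as exactly 1 gram per cubic centimeter (the intent of the definition):

cm_per_dm = 10
liter_in_cm3 = cm_per_dm ** 3          # 1 liter = 1 dm^3 = 1000 cm^3
grams_of_water = liter_in_cm3 * 1.0    # assumed 1 g per cm^3 of water -> 1000 g
print(liter_in_cm3, grams_of_water)    # 1000 cm^3 of water has a mass of 1000 g, i.e. 1 kg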

Although the metric system was not accepted with enthusiasm at first, adoption by other nations occurred steadily after France made its use compulsory in 1840. The standardized character and decimal features of the metric system made it well suited to scientific and engineering work. Consequently, it is not surprising that the rapid spread of the system coincided with an age of rapid technological development. In the United States, by Act of Congress in 1866, it was made "lawful throughout the United States of America to employ the weights and measures of the metric system in all contracts, dealings or court proceedings."

By the late 1860's, even better metric standards were needed to keep pace with scientific advances. In 1875, an international treaty, the "Treaty of the Meter," set up well-defined metric standards for length and mass, and established permanent machinery to recommend and adopt further refinements in the metric system. This treaty, known as the Meter Convention, was signed by 17 countries, including the United States.

As a result of the Treaty, metric standards were constructed and distributed to each nation that ratified the Convention. Since 1893, the internationally agreed-to metric standards have served as the fundamental measurement standards of the United States.

By 1900 a total of 35 nations - including the major nations of continental Europe and most of South America - had officially accepted the metric system. In 1971 the Secretary of Commerce, in transmitting to Congress the results of a 3-year study authorized by the Metric Study Act of 1968, recommended that the U.S. change to predominant use of the metric system through a coordinated national program. The Congress responded by enacting the Metric Conversion Act of 1975. Section 5164 of Public Law 100-418 requires federal agencies to use the metric system by 1992.

The International Bureau of Weights and Measures, located at Sevres, France, serves as a permanent secretariat for the Meter Convention, coordinating the exchange of information about the use and refinement of the metric system. As measurement science develops more precise and easily reproducible ways of defining the measurement units, the General Conference on Weights and Measures - the diplomatic organization made up of adherents to the Convention - meets periodically to ratify improvements in the system and the standards.

In 1960, the General Conference adopted an extensive revision and simplification of the system. The name Le Systeme International d'Unites (International System of Units), with the international abbreviation SI, was adopted for this modernized metric system. Further improvements in and additions to SI were made by the General Conference in 1964, 1967-1968, 1971, 1975, 1979, 1983, and 1991.

(Do you have hardbound edition available..just kidding)
 

Jerry Eldred

Forum Moderator
Super Moderator
#4
By the way, while we're on such trivia (Hey. It's Friday and I've had an exhausting week)...

My earliest traceable ancestor was King Aeldred or Ethelred or some derivation. This was supposed to be the first Saxon King. So in some strange ways, you could say I am the direct descendant of the King of the Yard. I sort of own it. I think I should charge royalties to everyone who uses the "Yard".

Have a nice weekend folks.
 
A

Al Dyer

#5
JKRH:

Great answer, and completely true until it was determined that that length was not constant "enough."

Sorry, wrong - it has been revised.:bigwave:

Jerry:

Thanks for the history lesson; it is great and correct. The one point that you missed is: what is the current constant used for determining the length of a meter? See JKRH's response.
______________________________________________

The answer surprised me, but makes perfect sense when you think about it. It has to be a constant. Geographical dimensions of the Earth are ever changing and therefore not a constant.:bigwave:
 

Jerry Eldred

Forum Moderator
Super Moderator
#7
I did find this other small bit of trivia on the meter (39.37 inches)..


In 1889, a new physical realization of the meter, the International Prototype Meter, was legalized by the 1st General Conference on Weights and Measures and constructed. This new realization, although constructed to agree in length with the 1799 bar, was an arbitrary standard. That is, it was not required to conform to any natural or absolute standard; rather, it was used to define the unit of length called the meter. The 1889 legislation defined the meter as the distance, at 0 degrees Celsius, between the center portions of two lines graduated on the polished surface of a particular bar of platinum-iridium alloy. Platinum-iridium was used because it is hard, resists oxidation, takes a very high mirror polish, and has a low coefficient of thermal expansion. The original International Prototype Meter is housed at the Bureau International des Poids et Mesures (BIPM) in Sevres, France.

29 copies of the International Prototype Meter were also constructed at the BIPM and distributed to other countries. Prototype Meter No. 27 was given to the United States and is now housed at the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland. Accurate comparisons between the secondary standards and objects of unknown length could be made using a longitudinal optical comparator. The precision of optical comparisons is limited to approximately one part in ten million (0.1 ppm). As science, technology, and commerce advanced in the twentieth century this level of precision became inadequate.

The pioneering work of the American scientist Albert Michelson, involving optical interferometric techniques, paved the way for a new and more precise definition of the meter. An interferometer is a device used to measure accurately the wavelength of light. In 1960, the 11th General Conference on Weights and Measures defined the meter to be 1,650,763.73 times the wavelength (in vacuum) of a particular orange-colored light emitted by krypton-86, an isotope of the element krypton with atomic mass 86.
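
A short sketch backing out the wavelength implied by that count (the ~605.78 nm result is just the reciprocal of the defined number, not a figure quoted above):

wavelengths_per_meter = 1_650_763.73    # 1960 definition: Kr-86 wavelengths per meter
wavelength_m = 1 / wavelengths_per_meter
print(wavelength_m * 1e9)               # ~605.78 nm, the orange Kr-86 emission line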

The advantage of the krypton standard is obvious. Since all Kr-86 atoms are alike, this atomic length standard is universally accessible to any suitably equipped scientific laboratory. It is not necessary to keep a "prototype krypton-86 atom" at the BIPM for reference, as was the case with the 1889 International Prototype Meter. Unknown lengths could be compared with the standard by the use of optical interferometry. However, the wavelength of the emitted light is slightly uncertain due to quantum mechanical effects that occur in the krypton atom during the emission process. These uncertainties limit the absolute precision of the krypton-86 length standard to the 1 to 3 parts per billion level. Still, this is a clear improvement over the 1889 length standard.

The current standard of length was defined during the 1983 General Conference on Weights and Measures. This standard of length is quite different from the 1889 and 1960 standards, since it is defined in terms of time. The 1983 standard defines the meter to be the distance traveled by light in vacuum in 1/299,792,458 of a second. The precision of this length standard is approximately one part in ten trillion, a factor of one million improvement over the 1889 standard. The basic rationale for this standard is the precision of time-interval measurement provided by the current generation of atomic clocks. This level of precision in time measurement, combined with the assumed constancy of the speed of light in vacuum, allows this "natural" standard of length to be defined. The constancy of the speed of light in vacuum was a fundamental postulate of Albert Einstein's theory of special relativity. A consequence of this definition of the meter is that the speed of light in vacuum is defined to have a value of 299,792,458 m/s. This result is in excellent accord with the best experimental determinations of the speed of light in vacuum, (299,792,458 ± 1) m/s.
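
A minimal sketch of that arithmetic; the 10 ns time of flight at the end is a hypothetical value, just for illustration:

c = 299_792_458            # m/s, exact by the 1983 definition
t = 1 / 299_792_458        # the defining time interval, in seconds
print(c * t)               # 1 meter by definition (floating point may round slightly)

# A measured time of flight gives a distance directly, e.g. a hypothetical 10 ns pulse:
print(c * 10e-9)           # ~2.998 m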

Practical realizations of the 1983 length standard using frequency-stabilized lasers allow the change in position (i.e., displacement) of an object or event in the millimeter range to be measured with an uncertainty of one picometer, the change in position in the meter range to ten nanometers, and the "length" (the distance between the endpoints) of a sub-meter sized object to an uncertainty of one nanometer. The measurement of the length of objects to a lower uncertainty is precluded by the slight deformation of the object under measurement by the measuring apparatus.
 
A

Al Dyer

#8
Jerry:

Sir, you are correct. The meter is based on the constant of the speed of light. I believe it equates to 1 billionth the speed of light.

I'll have to make the next one tougher!

Your manual is in the post.:bigwave:
 

Mike S.

Happy to be Alive
Trusted Information Resource
#9
Al Dyer said:

Jerry:

Sir, you are correct. The meter is based on the constant of the speed of light. I believe it equates to 1 billionth the speed of light.
I only skimmed the last few posts, but either my memory is really screwed-up or this post is wrong. From my Physics classes (too many years ago) I remember (I think!) the speed of light being 300 million meters per second, so 1 meter can't be 1 billionth the speed of light. Or, forgive me if this is the case, I've gone senile.:eek:

Mike S.
 

Jerry Eldred

Forum Moderator
Super Moderator
#10
After refreshing my memory, and re-reading what I posted (mostly cut-and-pasted), it isn't a portion of the "speed of light". It is based on assuming knowledge of the speed of light and of knowing the presumed characteristic wavelength of cesium (which is the current atomic frequency reference). The length of the meter is theoretically the distance light travels in "APPROXIMATELY" one 300-millionth of a second. So it is a distance versus a speed. We presume that light travels at "APPROXIMATELY" (I have to put that in for all you metrologists out there) 300 million meters per second. So although historically the meter was created using a fraction of the distance between the north pole and the equator (variable), we decided to put a lasso on it, and declare that this elusive meter (which would otherwise change over time) will be reverse-calculated back into a relationship between a presumed wavelength ("circular" traceability if you analyze it a little too much) and a presumed speed of energy traveling through space. We declare the meter to be based on a declared speed, which we in turn derive from the declared distance (meter)... and so on around the circle...
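
For the record, a quick check of the numbers behind that correction (rounded figures noted in the comments):

c = 299_792_458        # m/s -- about 300 million, not 300 billion
t = 1 / c              # ~3.34e-9 s, i.e. roughly one 300-millionth of a second
print(c * t)           # distance = speed x time = 1 meter: a distance, not a speed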

I don't think you've gone senile. I'll have to take the prize for senility.
 