Measurement Brain Teaser - What constant is the basis for the meter?


Al Dyer

#1
We know that at the time, the constant for a yard was the length from the King of England's nose to the tip of his fingers.

What constant is the basis for the meter?

I will give the winner a copy of ISO 9001:2000
________________________________________:bigwave:
 
#2
According to Encarta:

The meter (m), which is approximately 39.37 inches, was originally defined as one ten-millionth of the distance from the equator to the North Pole on a line running through Paris.
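Just as a rough arithmetic sketch (Python, nominal figures only - assuming the quarter-meridian from equator to pole is about 10,000 km), here's how that original definition lines up with the 39.37-inch number:

Code:
# One ten-millionth of the equator-to-pole arc, using the nominal 10,000 km figure
quarter_meridian_m = 10_000 * 1_000        # ~10,000 km expressed in meters
meter = quarter_meridian_m / 10_000_000    # = 1.0 by construction
inches_per_meter = 1 / 0.0254              # exact modern conversion: 1 in = 0.0254 m
print(meter, round(inches_per_meter, 2))   # 1.0 39.37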


James
:bigwave:

PS: Can I trade the 9K2K for a nice zinfandel?
 

Jerry Eldred

Forum Moderator
Super Moderator
#3
Guess I'd better chime in...

THE ENGLISH SYSTEM
The measurement system commonly used in the United States today is nearly the same as that brought by the colonists from England. These measures had their origins in a variety of cultures - Babylonian, Egyptian, Roman, Anglo-Saxon, and Norman-French. The ancient "digit," "palm," "span," and "cubit" units evolved into the "inch," "foot," and "yard" through a complicated transformation not yet fully understood.

Roman contributions include the use of the number 12 as a base (our foot is divided into 12 inches) and words from which we derive many of our present measurement unit names. For example, the 12 divisions of the Roman "pes," or foot, were called unciae. Our words "inch" and "ounce" are both derived from that Latin word.

The "yard" as a measure of length can be traced back to the early Saxon kings. They wore a sash or girdle around the waist that could be removed and used as a convenient measuring device. Thus the word "yard" comes from the Saxon word "gird" meaning the circumference of a person's waist.

Standardization of the various units and their combinations into a loosely related system of measurement units sometimes occurred in fascinating ways. Tradition holds that King Henry I decreed that the yard should be the distance from the tip of his nose to the end of his thumb. The length of a furlong (or furrow-long) was established by early Tudor rulers as 220 yards. This led Queen Elizabeth I to declare, in the 16th century, that henceforth the traditional Roman mile of 5,000 feet would be replaced by one of 5,280 feet, making the mile exactly 8 furlongs and providing a convenient relationship between two previously ill-related measures.

Thus, through royal edicts, England by the 18th century had achieved a greater degree of standardization than the continental countries. The English units were well suited to commerce and trade because they had been developed and refined to meet commercial needs. Through colonization and dominance of world commerce during the 17th, 18th, and 19th centuries, the English system of measurement units was spread to and established in many parts of the world, including the American colonies.

However, standards still differed to an extent undesirable for commerce among the 13 colonies. The need for greater uniformity led to clauses in the Articles of Confederation (ratified by the original colonies in 1781) and the Constitution of the United States (ratified in 1790) giving power to the Congress to fix uniform standards for weights and measures. Today, standards supplied to all the States by the National Institute of Standards and Technology assure uniformity throughout the country.

THE METRIC SYSTEM
The need for a single worldwide coordinated measurement system was recognized over 300 years ago. Gabriel Mouton, Vicar of St. Paul in Lyons, proposed in 1670 a comprehensive decimal measurement system based on the length of one minute of arc of a great circle of the earth. In 1671, Jean Picard, a French astronomer, proposed the length of a pendulum beating seconds as the unit of length. (Such a pendulum would have been fairly easily reproducible, thus facilitating the widespread distribution of uniform standards.) Other proposals were made but over a century elapsed before any action was taken.

In 1790, in the midst of the French Revolution, the National Assembly of France requested the French Academy of Sciences to "deduce an invariable standard for all the measures and all the weights." The Commission appointed by the Academy created a system that was, at once, simple and scientific. The unit of length was to be a portion of the earth's circumference. Measures for capacity (volume) and mass were to be derived from the unit of length, thus relating the basic units of the system to each other and to nature. Furthermore, the larger and smaller versions of each unit were to be created by multiplying or dividing the basic units by 10 and its powers. This feature provided a great convenience to users of the system, by eliminating the need for such calculations as dividing by 16 (to convert ounces to pounds) or by 12 (to convert inches to feet). Similar calculations in the metric system could be performed simply by shifting the decimal point. Thus the metric system is a "base-10" or "decimal" system.
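A tiny illustration of that decimal-shift point (hypothetical quantities in Python, just to contrast the two kinds of conversion):

Code:
# Customary conversions need awkward divisors...
ounces = 40
pounds = ounces / 16          # 2.5 lb
inches = 30
feet = inches / 12            # 2.5 ft

# ...while metric conversions are just powers of ten (shift the decimal point)
millimeters = 2500
meters = millimeters / 10**3  # 2.5 m
print(pounds, feet, meters)   # 2.5 2.5 2.5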

The Commission assigned the name metre (meter) to the unit of length. This name was derived from the Greek word metron, meaning "a measure." The physical standard representing the meter was to be constructed so that it would equal one ten-millionth of the distance from the North Pole to the equator along the meridian of the earth running near Dunkirk in France and Barcelona in Spain.

The metric unit of mass, called the "gram," was defined as the mass of one cubic centimeter (a cube that is 1/100 of a meter on each side) of water at its temperature of maximum density. The cubic decimeter (a cube 1/10 of a meter on each side) was chosen as the unit of fluid capacity. This measure was given the name "liter."
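A short sketch of how those length-based definitions chain together (Python, using the nominal 1 g/cm3 density of water at maximum density):

Code:
# liter = cubic decimeter; gram = mass of one cubic centimeter of water (original definitions)
liter_in_m3 = 0.1 ** 3            # (1/10 m)^3  = 0.001 m^3
cm3_in_m3 = 0.01 ** 3             # (1/100 m)^3 = 1e-6 m^3
water_density_g_per_cm3 = 1.0     # nominal, at maximum density (about 4 deg C)
grams_per_liter = (liter_in_m3 / cm3_in_m3) * water_density_g_per_cm3
print(round(grams_per_liter))     # 1000 g of water per liter, i.e. 1 kg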

Although the metric system was not accepted with enthusiasm at first, adoption by other nations occurred steadily after France made its use compulsory in 1840. The standardized character and decimal features of the metric system made it well suited to scientific and engineering work. Consequently, it is not surprising that the rapid spread of the system coincided with an age of rapid technological development. In the United States, by Act of Congress in 1866, it was made "lawful throughout the United States of America to employ the weights and measures of the metric system in all contracts, dealings or court proceedings."

By the late 1860's, even better metric standards were needed to keep pace with scientific advances. In 1875, an international treaty, the "Treaty of the Meter," set up well-defined metric standards for length and mass, and established permanent machinery to recommend and adopt further refinements in the metric system. This treaty, known as the Meter Convention, was signed by 17 countries, including the United States.

As a result of the Treaty, metric standards were constructed and distributed to each nation that ratified the Convention. Since 1893, the internationally agreed-to metric standards have served as the fundamental measurement standards of the United States.

By 1900 a total of 35 nations - including the major nations of continental Europe and most of South America - had officially accepted the metric system. In 1971 the Secretary of Commerce, in transmitting to Congress the results of a 3-year study authorized by the Metric Study Act of 1968, recommended that the U.S. change to predominant use of the metric system through a coordinated national program. The Congress responded by enacting the Metric Conversion Act of 1975. Section 5164 of Public Law 100-418 requires federal agencies to use the metric system by 1992.

The International Bureau of Weights and Measures, located at Sevres, France, serves as a permanent secretariat for the Meter Convention, coordinating the exchange of information about the use and refinement of the metric system. As measurement science develops more precise and easily reproducible ways of defining the measurement units, the General Conference on Weights and Measures - the diplomatic organization made up of adherents to the Convention - meets periodically to ratify improvements in the system and the standards.

In 1960, the General Conference adopted an extensive revision and simplification of the system. The name Le Systeme International d'Unites (International System of Units), with the international abbreviation SI, was adopted for this modernized metric system. Further improvements in and additions to SI were made by the General Conference in 1964, 1967-1968, 1971, 1975, 1979, 1983, and 1991.

(Do you have a hardbound edition available... just kidding)
 

Jerry Eldred

Forum Moderator
Super Moderator
#4
By the way, while we're on such trivia (Hey. It's Friday and I've had an exhausting week)...

My earliest traceable ancestor was King Aeldred or Ethelred or some derivation. This was supposed to be the first Saxon King. So in some strange ways, you could say I am the direct descendant of the King of the Yard. I sort of own it. I think I should charge royalties to everyone who uses the "Yard".

Have a nice weekend folks.
 

Al Dyer

#5
JKRH:

Great answer, and completely true until it was determined that that length was not constant "enough".

Sorry - wrong, it has been revised. :bigwave:

Jerry:

Thanks for the history lesson; it is great and correct. The one point that you missed is: what is the current constant used for determining the length of a meter? See JKRH's response.
______________________________________________

The answer surprised me, but makes perfect sense when you think about it. It has to be a constant. Geographical dimensions of the Earth are ever changing and therefore not a constant. :bigwave:
 

Jerry Eldred

Forum Moderator
Super Moderator
#7
I did find this other small bit of trivia on the meter (39.37 inches)...


In 1889, a new physical realization of the meter, the International Prototype Meter, was legalized by the 1st General Conference on Weights and Measures and constructed. This new realization, although constructed to agree in length with the 1799 bar, was an arbitrary standard. That is, it was not required to conform to any natural or absolute standard; rather, it was used to define the unit of length called the meter. The 1889 legislation defined the meter as the distance, at 0 degrees Celsius, between the center portions of two lines graduated on the polished surface of a particular bar of platinum-iridium alloy. The material platinum-iridium was used because it is hard, resists oxidation, takes a very high mirror polish, and has a low coefficient of thermal expansion. The original International Prototype Meter is housed at the Bureau International des Poids et Mesures (BIPM) in Sevres, France.

29 copies of the International Prototype Meter were also constructed at the BIPM and distributed to other countries. Prototype Meter No. 27 was given to the United States and is now housed at the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland. Accurate comparisons between the secondary standards and objects of unknown length could be made using a longitudinal optical comparator. The precision of optical comparisons is limited to approximately one part in ten million (0.1 ppm). As science, technology, and commerce advanced in the twentieth century this level of precision became inadequate.

The pioneering work of the American scientist Albert Michelson, involving optical interferometric techniques, paved the way for a new and more precise definition of the meter. An interferometer is a device used to measure accurately the wavelength of light. In 1960, the 11th General Conference on Weights and Measures defined the meter to be 1,650,763.73 times the wavelength (in vacuum) of a particular orange-colored light emitted by an isotope of the element krypton with atomic mass 86 (Kr-86).
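Back-calculating the wavelength implied by that 1960 definition is simple division (Python, no inputs assumed beyond the figure quoted above):

Code:
# 1960 definition: 1 m = 1,650,763.73 vacuum wavelengths of the Kr-86 orange line
wavelengths_per_meter = 1_650_763.73
wavelength_nm = (1 / wavelengths_per_meter) * 1e9
print(round(wavelength_nm, 2))    # ~605.78 nm - orange light, as advertised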

The advantage of the krypton standard is obvious. Since all Kr-86 atoms are alike, this atomic length standard is universally accessible to any suitably equipped scientific laboratory. It is not necessary to keep a "prototype krypton-86 atom" at the BIPM for reference, as was the case with the 1889 International Prototype Meter. Unknown lengths could be compared with the standard by the use of optical interferometry. However, the wavelength of the emitted light is slightly uncertain due to quantum mechanical effects that occur in the krypton atom during the emission process. These uncertainties limit the absolute precision of the krypton-86 length standard to the 1 to 3 parts per billion level. Still, this is a clear improvement over the 1889 length standard.

The current standard of length was defined by the 1983 General Conference on Weights and Measures. This standard of length is quite different from the 1889 and 1960 standards since it is defined in terms of time. The 1983 standard defines the meter to be the distance traveled by light in vacuum in 1/299,792,458 of a second. The precision of this length standard is approximately one part in ten trillion, a factor of one million improvement over the 1889 standard. The basic rationale for this standard is the precision of time interval measurement provided by the current generation of atomic clocks. This level of precision of time measurements, combined with the assumed constancy of the speed of light in vacuum, allows this "natural" standard of length to be defined. The constancy of the speed of light in vacuum was a fundamental postulate of Albert Einstein's theory of special relativity. A consequence of this definition of the meter is that the speed of light in vacuum is defined to have a value of 299,792,458 m/s. This result is in excellent accord with the best experimental determinations of the speed of light in vacuum, (299,792,458 ± 1) m/s.
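Turning the 1983 definition around into a light-travel time is straightforward arithmetic (numbers taken from the paragraph above):

Code:
# 1983 definition: the meter is the distance light travels in vacuum in 1/299,792,458 s
c = 299_792_458                      # speed of light in m/s, exact by definition
time_per_meter_ns = (1 / c) * 1e9
print(round(time_per_meter_ns, 4))   # ~3.3356 ns for light to cross one meter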

Practical realizations of the 1983 length standard using frequency-stabilized lasers allow the change in position (i.e., displacement) of an object or event in the millimeter range to be measured with an uncertainty of one picometer, the change in position in the meter range to ten nanometers, and the "length" (the distance between the endpoints) of a sub-meter sized object to an uncertainty of one nanometer. The measurement of the length of objects to a lower uncertainty is precluded by the slight deformations of the object under measurement by the measuring apparatus.
 

Al Dyer

#8
Jerry:

Sir, you are correct. The meter is based on the constant of the speed of light. I believe it equates to 1 billionth the speed of light.

I'll have to make the next one tougher!

Your manual is in the post.:bigwave:
 

Mike S.

Happy to be Alive
Trusted Information Resource
#9
Al Dyer said:

Jerry:

Sir, you are correct. The meter is based on the constant of the speed of light. I believe it equates to 1 billionth the speed of light.
I only skimmed the last few posts, but either my memory is really screwed up or this post is wrong. From my physics classes (too many years ago) I remember (I think!) the speed of light being 300 million meters per second, so 1 meter can't be 1 billionth the speed of light. Or, forgive me if this is the case, I've gone senile. :eek:
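A quick back-of-the-envelope check (Python, rounding the speed of light to the 300 million m/s I remember):

Code:
c_approx = 3e8          # ~300 million meters per second
print(c_approx / 1e9)   # "1 billionth of c" is 0.3 - and that's a speed (m/s), not a length
print(1 / c_approx)     # ~3.3e-09: a meter is the distance light covers in about 3.3 nanoseconds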

Mike S.
 

Jerry Eldred

Forum Moderator
Super Moderator
#10
After refreshing my memory, and re-reading what I posted (mostly cut-and-pasted), it isn't a portion of the "speed of light". It is based on assuming knowledge of the speed of light and the characteristic frequency of the cesium transition (the current atomic frequency reference, which defines the second). The length of the meter is theoretically the distance light travels in "APPROXIMATELY" one 300-millionth of a second. So it is a distance versus a speed. We presume that light travels at "APPROXIMATELY" (I have to put that in for all you metrologists out there) 300 million meters per second. So although historically the meter was created using a fraction of the distance between the North Pole and the equator (variable), we decided to put a lasso on it, and declare that this elusive meter (which would otherwise change over time) will be reverse-calculated back into a relationship between a presumed frequency ("circular" traceability if you analyze it a little too much) and a presumed speed of energy traveling through space. We declare the meter to be based on a declared speed, which we in turn derive from the declared distance (meter)... and so on around the circle...

I don't think you've gone senile. I'll have to take the prize for senility.
 