Do Americans use metric system?


If you indicate a dimension of 3230 mm, the precision achieved in the finished product must be +/- 0,5 mm; an indication of 3230,0 mm implies +/- 0,05 mm, and so on...
Things may be different in the trades, but IF the measurement is TRULY listed as 3230 mm then the terminal zero is NOT SIGNIFICANT. It is present only as a place-holder to move the right-most 3 into the tens column. That makes that 3 an UNCERTAIN digit. The ruler used to make that measurement had a RESOLUTION of 100 mm (it had actual marks every 100 mm); there were no marks on the ruler for tens or ones. Thus when the measurement was made, the actual object ended between the 3200 and 3300 marks. I estimated that the amount above the 3200 mark was about 30, so I wrote 3200 + 30 = 3230 mm, and the error would be half a unit in the uncertain digit's place, or +/- 5 mm. So the object is somewhere between 3225 and 3235 mm.
If you want that terminal zero to be significant, then place a decimal point after it or a bar over it. So listing the measurement as 3230. or 3230, is a more precise measurement. That ruler had marks for thousands, hundreds, and tens but no ones marks, so I estimated that the ones reading was 0, with an error of +/- 0.5 mm.
Now I certainly can specify in my plans a length of 3230 mm with a stated non-standard error of +/- 0.5 mm, but it has to be stated. Stating the error means that you can call for any precision you want: 3230 mm +/- 0.2 mm, for example.
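If it helps to see the convention mechanically, here is a minimal Python sketch of the rule described above, in which the implied uncertainty is half a unit in the last significant place. The function name and the placeholder-zero treatment are my own illustration, not anything standard:

```python
def implied_uncertainty(measurement: str) -> float:
    """Half a unit in the last *significant* place, per the convention above.

    A trailing zero with no decimal point is treated as a placeholder,
    so '3230' is read to the tens place, while '3230.' and '3230.0'
    are read to the ones and tenths places respectively.
    """
    m = measurement.strip()
    if '.' in m:
        digits_after = len(m.split('.')[1])
        return 0.5 * 10 ** (-digits_after)
    # No decimal point: trailing zeros are placeholders.
    stripped = m.rstrip('0')
    placeholder_zeros = len(m) - len(stripped)
    return 0.5 * 10 ** placeholder_zeros

print(implied_uncertainty('3230'))    # 5.0  -> object lies between 3225 and 3235
print(implied_uncertainty('3230.'))   # 0.5
print(implied_uncertainty('3230.0'))  # 0.05
```

The three calls reproduce the three readings discussed above: the bare 3230 mm carries +/- 5 mm, while the versions with a decimal point tighten it.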
 
Why do you have such a special talent for turning a light-hearted discussion into an oppressive bore? You are wrong because you are stating the conventions you know in your world, while other people know other conventions in theirs. In all cases the defining authority is the buyer. I have been involved in very heated and serious discussions between people buying and selling to specifications written in decimal notation while the laboratory charged with testing only quoted values in PPM. Whatever your opinion is, the specifications agreed by buyer and seller decide what is what; you are merely describing your version of the defaults.
 
This discussion is one of the types that, for me, makes participating in this forum enjoyable. Full of differing views, sometimes odd and esoteric information, useful information, and humor. Thank you all.

Possibly you will find the following interesting. It is my take on what occurred during my career as an engineer/fabricator.

In early- to mid-1970s high school (grades 7-12 in my neighborhood) I learned of something strange (to some of the teachers) and wondrous (to some of the other teachers) called the metric system. Basically, the shop (metal and wood) teachers thought "WTF", while the science (chemistry, physics, biology) teachers thought "yay!, we are moving into the future!".

At university (my majors were mechanical engineering and physics) and vocational-trade school (training for machining and other forms of fabricating) in the late-70s, these views were further expanded upon.

The engineering teachers had the attitude of "this would be a good idea if it offered any significant improvement in the ability to design and produce something, since it does neither...WTF".

The vocational-trade school teachers said "WTF, since every piece of equipment we have here is graduated/marked in English Standard, and since we are unlikely to receive any funding to replace/convert the WWII era surplus machines we have, WTF again".

The Physics and other science teachers said "yay!, we are moving into the future, really slowly, but moving we are".

Once I was in the working environment the views became somewhat more diverse as time passed. At my first real job in my chosen professions (at the Chrysler tank plant in Lima, Ohio) I was assigned to the alternate diesel power plant project for the M1 Abrams, then (at a low level) in various aspects of the pre-production design modification and trials, after which I was then transferred to various manufacturing aspects of the production turret weldments and assemblies.

At Chrysler the majority view of the engineers (via the executive types) was "it has become necessary that we can still design and manufacture items using English Standard, but dimension/tolerance them in both the English and Metric system, so that our allies across the pond can take part in a shared production environment."

The fabrication personnel thought "WTF, the Europeans must be too inept to convert inches to mm".

After leaving Teledyne and Chrysler (in 1983) I bounced around from company to company for the next 30 years or so, sometimes working as an engineer and sometimes as a fabricator (usually a machinist) and was exposed to continuing changes in attitude.

As an engineer it could be summed up as "our economy is so entwined with the rest of the world's industrialized nations that we need to get with the times...oh well".

As a machinist (aside from when manufacturing military items) I did not start to run into any perceptible acceptance of the metric system in general manufacturing until the mid-1990s, when the international trade system began to affect the US economy more, and more of the parts produced in the US were being sold in Europe. The basic attitude was "well hey, WTF, if we are going to make money we need to get with the times... slowly. But why can't we make the other industrialized nations use the English system?".

When I took continuing education classes at university and vocational/trade school, the university science teachers had the attitude of "yay, we can pat ourselves on the back, the sciences at least have joined the modern era!".

The vocational/trade school teachers (who had become part of the university system during the early- to mid-1990s) said "it is a great idea (not really, but we are willing to try). When are we going to be given funds to replace the WWII surplus manual lathes, knee mills, horizontal mills, surface grinders, cylindrical grinders, centerless grinders, tool grinders, jig grinders, jig borers, and the newer (but out-dated) CNC knee mills, CNC lathes, CNC machining centers, CNC turning centers,....etc that are all graduated in inches?". (In 1993-94 I taught substitute and part time in the machining department of a Technical College (aka vocational/trade school). While I was there, we were able to procure a new (new-out-of-the-box) 1947 LeBlond lathe.)

In the late-1990s to early-2000s I worked in several small to medium machine job shops (ie they did not specialize in any particular type of machining, being able to do prototyping, short-run production, and long-run production). 99% of the blueprints we received were dimensioned in inches, with about 30% or so using Geometric Tolerancing. Nearly all of the medical-industry-specific blueprints had been converted from metric to English dimensions specifically for the use of the machine shops.

Also in the early-2000s, I worked several contract engineering jobs in what might be considered specialized areas. One of them involved a large medical device company. They had recently had major problems with quality control in some of their products. My job was specific to teaching continuing education (ie remedial) classes in English-to-Metric/Metric-to-English blueprint conversion, engineering application tolerancing, Geometric Tolerancing, and SPC (Statistical Process Control) and TQM (Total Quality Management) in particular where the applications involved manufacturing and inspection of English-to-Metric/Metric-to-English requirements and Geometric Tolerancing.

The attitude of the engineers was good natured tolerance for the most part, except for the ones who were responsible for the problems...they exhibited resentment.

The attitude of the people in manufacturing and quality control departments was a willingness to learn and most were somewhat enthusiastic...except for the higher level supervisors...they exhibited resentment.

sigh
 
Why do you have such a special talent for turning a light hearted discussion into an oppressive bore? You are wrong because you are stating what you understand as the conventions you know in your world, other people know other conventions in theirs. In all cases the defining authority is the buyer and I have been involved in very heated and serious discussions between people buying and selling in specifications which were in decimal notation while the laboratory charged with testing only quoted values in PPM. Whatever your opinion is, specifications agreed by buyer and seller decide what is what, you are merely discussing your version of the defaults.

Exactly. Some people I knew had a quarry extracting granite slabs and accepted a contract with some Australian contractors for a skyscraper, but, rather carelessly, they did not notice (or did not care) that the contract stated a tolerance of +/- 0 mm for the thickness of the slabs. This was clearly a swindle. They had a hell of a time getting paid by the buyer (and I don't know whether they ever were paid).
 
I made these drawings very, very early in my career (pre-1980). No computers then! Everything had to be drawn with pencil, squares, and ruler. I bought my first computer able to draw, and my first plotter (an HP pen plotter), in 1983.

Sorry for the poor photos of the prints.
A steel truss, where dimensions are in mm: in a few cases, when calculations suggested it, an indication of 0,5 mm was included in the dimension.
capriata insieme.jpg

capriata particolare.jpg

And a concrete bridge, where dimensions are in meters or in cm, as that is the precision it is possible to achieve with that material.

Pila ponte.jpg

As far as I know these structures are still there, if the Libyans have not blown them up...
 
I'd like to tell you some facts about the design of these towers, in which I was involved in the same years (early '80s) together with a Canadian structural engineering firm, but unfortunately I haven't the time now...

8497_immagine_540x500.jpg
 
The goal of SI is to eliminate all of these various regional conventions and ways of doing things and adopt universal standards. I was never in the practical construction or engineering end of things, so from the scientific end of the scale it was always YEA METRIC. When SI took control in 1960 it was a lower-case "yea", mostly because of the elimination of units not derived from the seven base units and the adoption of derived units like Pascals which, IMHO, are impractical in the man-sized world. Gone were all those nice practical units like Angstroms, mm of Hg, Gauss, Lumens, etc.
The other change that I noted, and which I still see here on the forum, is the use of commas for decimal points. Seeing things like 0,05 mm is very strange to me and is one of the things addressed by SI. Here in the States we always used periods for decimal points, so I grew up with 0.05 mm while Europeans were writing 0,05 mm. Again, in the States we did use commas, but only to separate digits into groups of three: 1,000,000,000. SI addressed this by eliminating both the comma decimal point and the commas between groups of three digits. So in SI America kept 0.05 but lost the commas in large numbers; we are now supposed to write 1 000 000 000, and even after the decimal point we are supposed to group into threes: 0.000 000 000 05.
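For illustration only, the SI digit-grouping style described above can be sketched in a few lines of Python. The function is my own hypothetical helper, not a standard library facility:

```python
def si_format(x: str) -> str:
    """Group digits in threes with spaces on both sides of the decimal point,
    per the SI style described above ('1 000 000 000', '0.000 000 000 05')."""
    sign = '-' if x.startswith('-') else ''
    x = x.lstrip('+-')
    int_part, _, frac_part = x.partition('.')
    # Integer part: group in threes from the right.
    rev = int_part[::-1]
    int_grouped = ' '.join(rev[i:i + 3] for i in range(0, len(rev), 3))[::-1]
    # Fractional part: group in threes from the left.
    frac_grouped = ' '.join(frac_part[i:i + 3] for i in range(0, len(frac_part), 3))
    return sign + int_grouped + ('.' + frac_grouped if frac_part else '')

print(si_format('1000000000'))     # 1 000 000 000
print(si_format('0.00000000005'))  # 0.000 000 000 05
```

Note that grouping runs rightward from the decimal point in the fraction but leftward in the integer part, which is why the two halves are handled separately.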
Uncertainty in measurement was an easy topic in Ye Olde Days of Yore because Slide Rules only allowed 3 digits at most on most scales. Once electronic calculators came into play it became very difficult to convince students that eight or more of those digits in your calculation are meaningless garbage being simply error times error times error divided by error. So learning that 3230 mm; 3230. mm; 3230.0 mm; and 3230.00 mm are all radically different measurements made with radically different measuring "sticks" with a built-in Standard Error is a difficult concept for some to grasp. Significant Figures and Uncertainty in measurement and its effect on calculation takes lots of time.
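The point above about calculator garbage digits can be made concrete: a result should be trimmed back to the significant figures the data supports. A rough Python sketch of rounding to n significant figures (my own helper, not a built-in):

```python
import math

def round_sig(x: float, sig: int) -> float:
    """Round x to `sig` significant figures (guarding against x == 0)."""
    if x == 0:
        return 0.0
    exponent = math.floor(math.log10(abs(x)))
    return round(x, sig - 1 - exponent)

# A calculator shows 3230 / 7 = 461.42857142857..., but if 3230 carries
# only 3 significant figures, only 461 is defensible:
print(round_sig(3230 / 7, 3))   # 461.0
print(round_sig(0.0123456, 3))  # 0.0123
```

Everything past the kept digits is exactly the "error times error divided by error" garbage described above.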
Then Heisenberg's Uncertainty Principle raises its ugly little head and it all goes to Gehenna in a hand-basket.
 
"......We built Apollo using inches and put a man on the Moon.
They built the Space Shuttle with the metric system and it blew up..."
 
Uncertainty in measurement was an easy topic in Ye Olde Days of Yore because Slide Rules only allowed 3 digits at most on most scales. Once electronic calculators came into play it became very difficult to convince students that eight or more of those digits in your calculation are meaningless garbage being simply error times error times error divided by error. So learning that 3230 mm; 3230. mm; 3230.0 mm; and 3230.00 mm are all radically different measurements made with radically different measuring "sticks" with a built-in Standard Error is a difficult concept for some to grasp. Significant Figures and Uncertainty in measurement and its effect on calculation takes lots of time.
This is true, but just as calculators and computers improved things, they also introduced another problem. Many formatted test certificates quote all data to the same number of decimal places on chemical analysis, so all values will have 4 decimal places even if the method of testing doesn't test to that level of accuracy.

A digital ultrasonic testing machine will test and show a result to as many decimal places as the manufacturer wishes it to show (I have seen three places), but on stainless steels you have no way to calibrate the machine to anything like that accuracy. Micrometers are now digital and will read to 2 decimal places, maybe more, but unless it is a precision-machined parallel piece you will never get the same reading twice, because the thickness itself is not consistent, quite apart from the extra problem of ensuring the instrument is seated correctly on a curved surface.

Hardness testing has also been computerised, with the load applied hydraulically and a computer evaluating the indentation. Unfortunately, in both Mannesmann GRW and the SVM research institute, these miracles of technology produced results that were physically impossible despite being quoted to an extra decimal place.
 
Pbehn - As I posted earlier I'm not in any of the trades or engineering so I can only go by the standards we were held to at the University when I was working on my degrees. Any and all test machines had manufacturer stated limits of accuracy and a result could only be as accurate as the least accurate measurement. In addition/subtraction all measurements had to be rounded to the greatest uncertainty before adding/subtracting. Multiplication/Division results were rounded to the same number of significant figures as the least number in the original data.
In a series of measurements, precision was another issue, and final results were subject to a number of statistical analyses to determine whether they were statistically significant.
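The two rounding rules described above (fewest decimal places for addition/subtraction, fewest significant figures for multiplication/division) can be sketched as follows. The helper names and the (value, precision) pairing are my own assumptions for illustration:

```python
import math

def round_sig(x: float, sig: int) -> float:
    """Round x to `sig` significant figures."""
    if x == 0:
        return 0.0
    return round(x, sig - 1 - math.floor(math.log10(abs(x))))

def add_measurements(values_with_places):
    """Addition rule: round the sum to the fewest decimal places
    among the inputs. Takes (value, decimal_places) pairs."""
    total = sum(v for v, _ in values_with_places)
    places = min(p for _, p in values_with_places)
    return round(total, places)

def multiply_measurements(values_with_sigfigs):
    """Multiplication rule: keep only as many significant figures
    as the least precise factor. Takes (value, sig_figs) pairs."""
    product = math.prod(v for v, _ in values_with_sigfigs)
    sig = min(s for _, s in values_with_sigfigs)
    return round_sig(product, sig)

# 12.1 (1 decimal place) + 0.065 (3 places) -> rounded to 1 place:
print(add_measurements([(12.1, 1), (0.065, 3)]))     # 12.2
# 3.14 (3 sig figs) * 2.0 (2 sig figs) -> 2 significant figures:
print(multiply_measurements([(3.14, 3), (2.0, 2)]))  # 6.3
```

The pairing makes the precision of each input explicit, since a bare float cannot record how many of its digits were actually measured.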
As of May 2019, SI has fixed the values of several fundamental physical "constants" so that they are now actually constant, the measurements behind them being good to within about 20 parts per billion, and in some cases within 10 parts per billion. So, for example, from the results obtained from 6 different Kibble balances, Planck's constant is now fixed at 6.62607015 × 10^-34 kg·m²/s (J·s), with roughly a 10 parts per billion uncertainty in the measurements that fixed it.
 
Several years ago the TransAmerican Free Trade Route (or something) was floated here in Arizona Territory, with klicks from the border to somewhere around Tucson, changing to miles thereafter. (Maybe it was supposed to go as far as Vegas, baby.) At least that's what I recall from my phone inquiry to the relevant office. The civil servant on the other end was absolutely evangelical: offered/insisted on sending me literature about the program when all I wanted to know was whether the report I heard was true. Since then I've not been south of Tucson very much, and don't know how the "mile" markers are measured.

Then there was the Japanese Navy which, I believe, measured distance in nautical miles, speeds in knots, and altitude in meters hooboy...
 
The Japanese Navy was formed with the cooperation of the British when they were allies, so having distance and speed in nautical miles and knots isn't difficult to understand for a navy: a knot is a nautical mile per hour, isn't it? The nautical mile is one of the few measures based on the dimensions of the globe we actually live on; the name "mile" is almost coincidental, since they are different units. A nautical mile is a unit of measurement used in both air and marine navigation, and for the definition of territorial waters. Historically, it was defined as one minute (1/60 of a degree) of latitude along any line of longitude.
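The arithmetic is easy to check; a small Python sketch using the modern definition of exactly 1852 m per international nautical mile:

```python
# One international nautical mile is defined as exactly 1852 m; historically
# it was one minute (1/60 of a degree) of latitude along a meridian.
NM_IN_METERS = 1852.0

def knots_to_mps(knots: float) -> float:
    """A knot is one nautical mile per hour."""
    return knots * NM_IN_METERS / 3600.0

# By the historical definition, the meridian circumference works out to
# 360 degrees * 60 minutes = 21 600 NM:
print(21_600 * NM_IN_METERS / 1000)  # ~40 003 km, close to the real figure
print(round(knots_to_mps(10), 2))    # 10 knots is about 5.14 m/s
```

That the historical definition lands within a fraction of a percent of the Earth's actual meridian circumference is exactly why the unit survives in navigation.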
 
Things may be different in the trades but IF the measurement is TRULY listed as 3230 mm then the terminal zero is NOT SIGNIFICANT. It is present only as a place-holder to move the left-most 3 into the tens column.

Maybe in calculations the 0 is not significant, but when two things have to fit together it is very significant.


The ruler used to make that measurement had a RESOLUTION of 100 mm (had actual marks every 100 mm) There were no marks on the ruler for Tens or Ones. Thus when the measurement was made the actual object ended between the 3300 and 3200 marks. I estimated that the amount above the 3200 mark was about 30 so I wrote 3200 + 30 = 3230 mm and the error would be half the uncertain digit or +/- 5 mm. So the object is somewhere between 3235 and 3225 mm.

Where did that come from?

Elmas mentioned that dimension in terms of a drawing. If you are making something according to a drawing dimensioned in mm you better not be using a measurement device with a resolution of 100mm. And you should not be estimating measurements.
 

Thanks Wuzak for the aid in explanation.
I didn't reply to some post because, to explain, I had to start from scratch.
So, let's begin...

ERRORS

Errors are normally classified in three categories: systematic errors, random errors, and blunders.
Systematic Errors
Systematic errors are due to identified causes and can, in principle, be eliminated. Errors of this type result in measured values that are consistently too high or consistently too low. Systematic errors may be of four kinds:
1. Instrumental. For example, a poorly calibrated instrument such as a thermometer that reads 102° C when immersed in boiling water and 2° C when immersed in ice water at atmospheric pressure. Such a thermometer would result in measured values that are consistently too high.
2. Observational. For example, parallax in reading a meter scale.
3. Environmental. For example, an electrical power "brown out" that causes measured currents to be consistently too low.
4. Theoretical. Due to simplification of the model system or approximations in the equations describing it. For example, if your theory says that the temperature of the surrounding will not affect the readings taken when it actually does, then this factor will introduce a source of error.
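If the instrumental error in example 1 is assumed to be linear, the faulty thermometer can in principle be corrected by a two-point calibration. A hypothetical Python sketch (the function and its default readings simply mirror the example above):

```python
def correct_reading(reading: float,
                    ice_reads: float = 2.0,
                    boil_reads: float = 102.0) -> float:
    """Linearly map the faulty scale back to true Celsius, assuming the
    instrument's error is itself linear: the thermometer above reads 2
    at a true 0 degC and 102 at a true 100 degC."""
    slope = 100.0 / (boil_reads - ice_reads)  # true degrees per indicated degree
    return (reading - ice_reads) * slope

print(correct_reading(2.0))    # 0.0   (ice water)
print(correct_reading(102.0))  # 100.0 (boiling water)
print(correct_reading(52.0))   # 50.0
```

This is the sense in which systematic errors "can, in principle, be eliminated": once the cause is identified and characterized, a correction can be applied.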

Random Errors
Random errors are positive and negative fluctuations that cause about one-half of the measurements to be too high and one-half to be too low. Sources of random errors cannot always be identified. Possible sources of random errors are as follows:
1. Observational. For example, errors in judgment of an observer when reading the scale of a measuring device to the smallest division.
2. Environmental. For example, unpredictable fluctuations in line voltage, temperature, or mechanical vibrations of equipment.
Random errors, unlike systematic errors, can often be quantified by statistical analysis, therefore, the effects of random errors on the quantity or physical law under investigation can often be determined.
An example to distinguish between systematic and random errors: suppose that you use a stop watch to measure the time required for ten oscillations of a pendulum. One source of error will be your reaction time in starting and stopping the watch. During one measurement you may start early and stop late; on the next you may reverse these errors. These are random errors if both situations are equally likely. Repeated measurements produce a series of times that are all slightly different; they vary randomly about an average value.
If a systematic error is also included (for example, your stop watch is not starting from zero), then your measurements will vary, not about the average value, but about a displaced value.
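The stopwatch example can be illustrated with a quick simulation; the numbers below (true time, reaction spread, watch offset) are invented purely for illustration:

```python
import random
import statistics

random.seed(0)  # reproducible illustration

TRUE_TIME = 20.0        # hypothetical true time for ten oscillations, in s
REACTION_SPREAD = 0.15  # random start/stop error (standard deviation), in s
WATCH_OFFSET = 0.30     # systematic error: watch not starting from zero, in s

random_only = [TRUE_TIME + random.gauss(0, REACTION_SPREAD) for _ in range(1000)]
with_offset = [t + WATCH_OFFSET for t in random_only]

# Random errors scatter symmetrically about the true value...
print(round(statistics.mean(random_only), 2))  # close to 20.0
# ...while a systematic error shifts the whole distribution:
print(round(statistics.mean(with_offset), 2))  # close to 20.3
```

Averaging beats down the random component but leaves the systematic displacement untouched, which is the distinction the paragraph above is making.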

Blunders
A final source of error, called a blunder, is an outright mistake. A person may record a wrong value, misread a scale, forget a digit when reading a scale or recording a measurement, or make a similar blunder. These blunders should stick out like sore thumbs if we make multiple measurements or if one person checks the work of another. Blunders should not be included in the analysis of data.

the rest is here
New Mexico State University - Department of Physics
 
And you should not be estimating measurements.
Since people are not perfect, and neither are their machines, it goes without saying that all measurements have UNCERTAINTY, and in a correctly made measurement that uncertainty is in the terminal digit.
Next, the edge (or whatever you are measuring) will always end up between two marked lines on the instrument doing the measuring. When you read the measurement from the instrument, you start from the largest marked point and work your way down one marked line at a time until you come to the two lines that the actual object edge lies between. Here is where you ESTIMATE the value: there are no marked lines here, so you have to estimate. So all correctly made measurements contain all digits known with certainty and ONE uncertain digit. The digits 1-9 are always SIGNIFICANT DIGITS. ZERO, however, has TWO functions, and ONE of those functions is its use as a PLACEHOLDER. Placeholder zeros are NEVER significant, as they were never read from an instrument. In 93,000,000 miles only the 9 and the 3 are significant; the six zeros are placeholders, and the 3 was estimated and is an uncertain digit.
IF a terminal zero is to be significant, you have to specifically indicate it. So in 3450 cm the zero is a placeholder and the 5 is uncertain, giving 3 significant figures; but in 3450. cm the zero is significant and is the uncertain digit, giving 4 significant figures.
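The zero-counting convention described above can be expressed as a short sketch. This is a simplified rule of thumb in Python (my own illustration; it ignores scientific notation and other edge cases):

```python
def significant_figures(measurement: str) -> int:
    """Count significant figures under the convention described above:
    trailing zeros without a decimal point are placeholders, while a
    terminal decimal point makes them significant."""
    m = measurement.strip().replace(',', '')
    if '.' in m:
        digits = m.replace('.', '').lstrip('0')
        return len(digits)
    # No decimal point: leading and trailing zeros are not significant.
    return len(m.lstrip('0').rstrip('0'))

print(significant_figures('3450'))      # 3 (terminal zero is a placeholder)
print(significant_figures('3450.'))     # 4 (decimal point makes it significant)
print(significant_figures('93000000'))  # 2 (only the 9 and the 3 were read)
```

Interior zeros (as in 3050) still count, since only leading and trailing zeros are stripped.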
 
"......We built Apollo using inches and put a man on the Moon.
They built the Space Shuttle with the metric system and it blew up..."
I'm not aware of any evidence that the unit of measure used during the respective construction process of the two was ever suspected as the cause of the failures.
 
