Got Metrology?

Summertime Heat Offers Measurement Challenges

In the world of dimensional measurement, temperature changes can create challenges to obtaining accurate and precise measurements. For those who must work in environments that are not climate controlled, precision measurement can become extremely challenging when the summer months come along: not only are they faced with the physical challenge of the often overbearing heat, but also with parts that change size with every tick of the clock.

CMSC Weather Forecast

With the advancements in measurement systems and software, operators have the ability to control the effects of temperature on their measurement equipment and the parts they measure; however, proper application of these controls is critical to achieving the desired measurement results. Compensation of Thermal Expansion allows Measurement Technicians to correct for changes in part size away from "nominal" (68 degrees Fahrenheit or 20 degrees Celsius is nominal for most materials), but the Measurement Technician must be aware of when to apply this compensation and how to do it. Unfortunately, this is not always as easy as it should be, as every software package and every piece of measurement hardware works differently. Some measurement equipment requires the Technician to enter a temperature for the device; this does not mean it is compensating for the temperature of the part being measured. If the operator enters the part temperature as the device temperature, it will also yield inaccurate measurement data.
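For readers who like to see the arithmetic, here is a minimal sketch of the correction described above, assuming a simple linear expansion model; the coefficient is a typical handbook value for aluminum, not something taken from any particular software package:

```python
# Minimal sketch: correcting a measured length back to its size at the
# 68 deg F (20 deg C) reference temperature, assuming linear expansion.
# The coefficient below is a typical handbook value for aluminum.

ALPHA_ALUMINUM = 12.3e-6  # in/in/deg F, approximate linear CTE

def length_at_reference(measured_length, part_temp_f,
                        alpha=ALPHA_ALUMINUM, ref_temp_f=68.0):
    """Scale a length measured at part_temp_f back to the reference temperature."""
    scale = 1.0 + alpha * (part_temp_f - ref_temp_f)
    return measured_length / scale

# A nominally 100.000 in aluminum part measured at 88 deg F reads long:
print(length_at_reference(100.0246, 88.0))  # ~100.000
```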

So what does all this mean? If you are measuring parts in uncontrolled environments, are not sure how to properly correct for temperature, and need accurate and precise measurement data, contact your measurement hardware and software providers for details on correcting for temperature properly. Note: many devices do not require an air temperature for operation, as they are largely unaffected by it (articulating arms, for example), while others, typically those using lasers, are affected by both temperature and air pressure and must have accurate values for both to achieve accurate results.

For those Measurement Technicians out there working in the heat of uncontrolled environments, stay hydrated and keep measuring!!


How much Data is enough Data?

At the beginning of my career I worked at a large aircraft manufacturer in the Northwest as a Jig Builder, fabricating jigs that ranged from small tabletop size to jigs that were over a hundred feet long and four stories tall. I learned to measure with Vernier Micrometers, Calipers, Height Gages, "Suitcase" Micrometers, Ball Gages, Adjustable Parallels, Transits, Optical Levels, Alignment Scopes, etc. These were all single-dimension measurements; it took multiple setups and a great deal of time and skill to properly set tooling details to the tolerances we worked to back then (+/- 0.010 inches), occasionally working as close as +/- 0.005 inches.

Working with single-dimension measurement devices often required three different types of measurement devices and three different indexing setups to set a detail in all three directions (X, Y, Z), or in aircraft terms: Station, Buttline, Waterline. This is how aircraft tooling was typically built until the early 1990's, when Computer Aided Measurement Systems became user friendly enough for shop use. For decades prior to 1990, Tooling was a highly skilled profession requiring the "Toolmaker/Jig Builder" to have a detailed understanding of several types of measurement equipment and how to properly utilize them in order to fabricate tooling to the required tolerances. The Toolmaker/Jig Builder would also be called upon to verify aircraft parts to tolerances of 0.030″; this might include taking measurements at specific locations on the "skin" of the aircraft by hand-holding optical scales that were read with two different measurement devices, a time-consuming task.

In the late 80's and early 90's, Computer Aided Theodolites (CAT) became available and quickly became the "measurement device" of choice for building tooling. The CAT Systems allowed the Toolmaker/Jig Builder to measure all three dimensions from one setup for the first time ever. While these devices reduced the time in which tooling could be fabricated, they still required a great deal of skill to obtain the precision and accuracy required for aircraft tooling. Verification of aircraft parts could now be accomplished by placing sticker targets on the aircraft and measuring them with at least two theodolites (they are a triangulation-based system).

By the mid 90's, Laser Trackers made their appearance, and as fast as the CAT System had appeared, it became obsolete. The Laser Tracker could provide real-time 3D measurements, was not as difficult to set up as the CAT system or the earlier one-dimensional measurement devices, and was able to achieve a much higher degree of accuracy; it quickly became the measurement device of choice around the world for tool fabrication, not only in aerospace but in the automotive industry as well.

The arrival of the Laser Tracker, with improved accuracy and the ability to measure 1,000 points per second, also brought big changes in tooling tolerances. Suddenly the +/- 0.010 inch tolerances of the past were no longer acceptable, with many Engineers expecting that tooling could be built to the tolerance of the Laser Tracker itself (+/- 0.0020 inches). Aircraft parts could now be measured by "scrubbing" the surface with the Laser Tracker's spherically mounted reflector (SMR), collecting thousands of points very quickly and characterizing surfaces faster and more accurately than ever before.

By the end of the 20th century, Laser Scanners of all varieties had become available, and millions of points could now be collected very quickly without having to contact the surface of the part being measured.

So, with all of this technology and the ability to measure faster and collect more data, Tooling and Manufacturing have improved dramatically in the aircraft industry; however, the overall result might be even better if the Engineering requirements in some areas were a little more reasonable.

The ability to measure more accurately and precisely should not automatically equate to the "tightening up" of tolerances, as has been seen worldwide; the reality is simply that our measurements are more accurate than in the 1D measurement device days. When tolerances are "tightened", in almost every situation it equates to a higher cost for the end product, as more time is required to meet the new tolerance. The ability to measure millions of points on parts that were previously measured with hundreds of points does not always mean a better product, as the data must be analyzed and deciphered; sometimes the "overload" of data can actually stop production while a decision is made on whether the part is good or bad.

Having spent a lifetime working in measurement, I have enjoyed being on the cutting edge of the new measurement technologies and would not trade it for another profession, but I do think back on the "old days" occasionally and wonder: how much data is enough data? There are many airplanes and automobiles still operational today that were built using 1D measurement devices; while they are clearly not as accurate as today's products, are we getting to the point of data overload? Are we expecting too much of our measurement systems given the limitations of manufacturing materials and environments?

Practical Metrology: Laser Trackers

Basic use and application of Laser Trackers in manufacturing from Quality Magazine.

Practical Metrology: Laser Trackers | 2013-05-03 | Quality Magazine.

A Chain will be used to determine Winners/Losers in the NFL Playoffs

Here in the United States, football is our American sport of late summer and fall. We love our baseball, but come football season the chants of "are you ready for some football" can be heard everywhere. What does that have to do with Metrology? Well, nothing….yet…..

Logo of the National Football League Playoffs, 2011–present (Photo credit: Wikipedia)

In the American game of football, the goal is to move the ball down the field to the end zone. The team with the ball is given four chances ("plays") to advance the ball 10 yards; if they advance it the minimum 10 yards they are rewarded with another four chances, and so on, until they reach the "end zone", where a touchdown and 7 points await. This ten yards is where Metrology should come into play…..

Over the last several years, the American game of football has used technology to improve the game, adding "instant replay" to determine whether the Referees made the correct call or whether the ball was placed in the correct location on the field. In the National Football League (NFL) and in college football, speakers have been added to the Quarterback's helmet (the Quarterback is the on-field "General" of the offensive unit) and to the Linebacker's helmet (the Linebacker is the on-field "General" of the defensive unit) so that their coaches can communicate with them and adjustments can be made in real time. Technologies in the players' uniforms, shoes and helmets have improved the safety and comfort of the players. Even the balls are kept at an exact temperature to ensure that everything is "equal" when they are kicked, and team benches are heated or cooled depending on the climate. Technology is everywhere except…..

…the aforementioned measurement of the 10 yards, the increment the ENTIRE game is based on. This 10 yard increment has the largest impact on a game of any single component: millions of dollars are won and lost based on it, and jobs are won and lost based on it. That 10 yards, or 30 feet, carries a lot of weight; that 10 yards is everything. So how do you think it is measured in the year 2010, when the ability to measure 30 feet (9.1 meters) is down to the microns? Lasers? Total Stations? Some other sophisticated measurement system? NOPE, sorry: two poles and a chain….sounds like land surveying dating as far back as history takes us, right?

Measurement Method of the First Down

Several times during an American football game the "chain gang" will be called in from the sideline, dragging out their poles and chain. The Referee will "index" the pole by hand to a line of chalk; this is where the measurement is taken from. The chain is "stretched" to the tip of the football, to see if the "nose" of the ball exceeds the pole at the other end, 10 yards (9.1 meters) away. This is how that all-important measurement is taken. To those of us involved in the science of measurement the possibilities for inaccurate measurement are endless, and yet this is what American football is based on; this has been the way since 1906.

When the Vice President of Officiating from the National Football League was asked about this two years ago, he stated that he did not think it was perfectly accurate. Yet nearly two years later there has been no advancement in how this crucial measurement is taken. Why is that?

To be clear on the procedure that is used, here is a quick synopsis:

On a first down, one end of the chains is placed along the sideline by one member of the seven-person chain gang — hired for game-day duty by the home team — six feet from the field, supposedly even with the front tip of a football ("eyeballed") that will be snapped at least 25 yards away. When a play ends, an official estimates the spot, usually marking it with a foot and tossing the ball to another official to set for the next play. When a first down is too close to call, the chains are brought onto the field.

Sometimes the “drive” by the offensive team continues by an inch. Sometimes it ends by less.

There have been several attempts at innovating this process; however, it seems that the "ritual" of calling the markers onto the field, with the suspense it brings as the crowd quiets and watches the referee, suddenly concerned with accuracy, hold the ball still while the chains are stretched, is what stops technology from intervening. The "american football gods" will say that newer methods are unproven and could lead to errors in the game; yet these same people would not want to hear how the human eye leads to errors when attempting to align an object 75 feet away.

Perhaps, someone reading this will come up with the system that will change the way the “first down” is measured!

Compensating for Thermal Expansion – does size matter?

In today's manufacturing world, improvements in technology have allowed us to accurately and precisely design, manufacture and inspect parts and assemblies to tighter tolerances than at any time in history.

Many applications do not require any Compensation for Thermal Expansion (CTE) due to size, type of material, or design/engineering requirements. However, many applications benefit from applying CTE due to the material type and the environment. In general, the larger the part or object being measured, the bigger the effect of thermal changes on the material of the part/object.

During inspection and fabrication (particularly of automotive/aerospace tooling), the temperature of the object being measured can have a major impact on the measurement results. Understanding the proper application of the Compensation for Thermal Expansion will result in more repeatable measurements regardless of the object's temperature.

Measurement Devices and the Environment

Many measurement devices that utilize laser interferometry require accurate environmental measurements of temperature, air pressure and humidity in order to acquire accurate measurement data. Corrections for these environmental factors are not required for the operation of some other measurement systems, for example Articulating Arms, Photogrammetry and CMMs.

When using a measurement system with a laser interferometer or a distance meter of some variety (Laser Trackers most commonly, also Total Stations), the temperature and air pressure entered during the initialization process are used to correct the instrument's measurements back to the standard 68° F or 20° C. This is particularly important to note when point-to-point distances will be measured on an object that is not at 68° F or 20° C.
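As a rough illustration of why these values matter, here is a hedged sketch using a published shop-floor approximation for the refractive index of air (valid near standard conditions, for red laser light around 633 nm); this is not any particular manufacturer's formula:

```python
# Why laser-based distance meters need air temperature and pressure: the
# laser wavelength depends on the refractive index of the air it travels
# through. Simplified shop-floor approximation (P in kPa, T in deg C,
# relative humidity in percent).

def air_refractive_index(temp_c, pressure_kpa, rh_percent=50.0):
    """Approximate refractive index of air (simplified Edlen-style formula)."""
    return (1.0
            + 7.86e-4 * pressure_kpa / (273.0 + temp_c)
            - 1.5e-11 * rh_percent * (temp_c**2 + 160.0))

n_ref = air_refractive_index(20.0, 101.325)
n_hot = air_refractive_index(30.0, 101.325)
# A 10 deg C error in the entered air temperature corresponds to roughly
# 90 microns of distance error over a 10 m path:
print((n_ref - n_hot) * 10_000_000)  # error in microns over 10 m
```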

As always, refer to the Manufacturer's recommendations.

When to Compensate

The type of measurement task will determine whether Compensation of Thermal Expansion (CTE) must be used and when the CTE should be applied.

Common Applications

  • The object is constructed of material that is not thermally stable
  • Tight tolerances in an environment where temperature is not controlled
  • Large objects
  • Establishing large Measurement Networks on large structures

Applying CTE

The actual application of Compensation of Thermal Expansion (CTE) is unique in nearly every Measurement Software package on the market. To properly apply CTE in the software being used, the operator MUST understand the following to be successful:

  • Type of Material – the exact type of material should be known in order to accurately calculate the CTE of the object.
  • Part Temperature – depending on the required accuracy of the measurement task, a variety of devices are available to read temperatures. For small parts one reading is sufficient; when the task is Large Volume Measurement, a number of readings may be required and an average determined. If the object is tall, readings should be taken at several heights.
  • What type of 'Part Alignment' will be used – the sophisticated measurement software packages on the market provide the power of applying CTE through a 7-parameter transformation matrix. This provides the ability to 'Align' to the part using a 'Best Fit' algorithm, which fits Translations, Rotations and Scale based on Nominal data (a minimal sketch of this kind of fit follows this list). A keen understanding of how the Nominal data was established is required prior to 'Scaling' an Alignment based on the Nominal information; Nominal data that was not properly valued will yield inaccurate to catastrophic results.
  • If a 'minimum fit' Alignment will be used, when should Scale be applied – this varies between measurement software packages: some require CTE to be applied prior to this type of Alignment, others after the Alignment has been calculated. See the Software documentation for best practices.
  • Does the CTE need to be applied before or after Data Collection – this is key, as some software packages apply the CTE as the data comes into the software, while others apply it to the measured data during post-processing.
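The 7-parameter fit mentioned above is, in generic terms, a least-squares similarity transform. Here is a minimal sketch using the classic Umeyama/Horn formulation; it illustrates the idea only and is not the algorithm of any specific metrology package:

```python
# Sketch of a 7-parameter (3 translations, 3 rotations, 1 scale) best fit:
# the least-squares similarity transform mapping measured points onto
# nominal points (Umeyama's method).
import numpy as np

def best_fit_7_parameter(measured, nominal):
    """Return rotation R, scale s, translation t so nominal ~= s*R@measured + t."""
    mu_m, mu_n = measured.mean(axis=0), nominal.mean(axis=0)
    dm, dn = measured - mu_m, nominal - mu_n
    H = dm.T @ dn / len(measured)          # cross-covariance of the two sets
    U, S, Vt = np.linalg.svd(H)
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:          # guard against a reflection
        D[2, 2] = -1.0
    R = Vt.T @ D @ U.T
    s = np.trace(np.diag(S) @ D) / dm.var(axis=0).sum()  # the 'Scale' factor
    t = mu_n - s * R @ mu_m
    return R, s, t

# If the part has expanded uniformly relative to the nominal data, s should
# land very close to 1 / (1 + alpha * delta_T).
```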

Verifying ‘Scale’

When applying a Compensation of Thermal Expansion, the 'Scale' of the measurement task or dataset is changed. When using a 7-parameter 'Best Fit' Transformation, this 'Scale' factor should be verified, in particular on large-scale measurement tasks. Scale factors are typically displayed as a ratio of the active Units of Measure; this ratio needs to be considered over the entire measurement 'Envelope'. A common mistake is to review ONLY the X, Y, Z, 3D deviations after Scale has been applied.
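A toy calculation (invented numbers) shows why the ratio must be judged across the whole envelope rather than at a single point:

```python
# A scale ratio that looks negligible still grows linearly with distance
# from the fit centroid; illustrative numbers only.
scale_factor = 1.000012          # ratio reported by a 7-parameter best fit
envelope = 400.0                 # inches across the measurement envelope
print((scale_factor - 1.0) * envelope)  # ~0.0048 in of stretch across the envelope
```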

To properly evaluate the CTE, a calibrated 'artifact' made of the same type of material must be used, and the 'artifact' must also be at the same temperature as the part being measured; for these reasons, many facilities are unable to properly evaluate the application of CTE.

Using a Certified Artifact

One method for verifying Scale is to use some type of 'Artifact', typically a 'Scale Bar' or 'Ball Bar'. The artifact must be of the same type of material as the part being measured, and at the same temperature.

Certified Ball Bars

Scale Bars/Ball Bars can be made from a wide variety of materials, the most common being Aluminum, Steel, Composite and Invar.

Ball Bars have highly accurate spheres fixed to each end. These Ball Bars typically have a calibrated distance between the two spheres.

Scale Bars typically have magnetic mounts at each end to accept Laser Tracker Spherically Mounted Reflectors (SMRs) or highly accurate spheres. The distance between the points is calibrated using highly accurate laser interferometers.

Certified Scale Bars

How to Verify Scale – When an artifact of the same type of material is available, it must be thermally 'soaked' to the same temperature as the object being measured. After applying the CTE within the Measurement Software, the operator can measure Points or Spheres (depending on the artifact) and simply check the point-to-point distance. Depending on the accuracy of the measurement system being used, the result should typically be within 0.0020 inches (0.050 mm); the results will vary with the device being used and the accuracy of the artifact.
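In code form, the check amounts to nothing more than a point-to-point distance compared against the calibrated value; the coordinates and tolerance below are illustrative placeholders:

```python
# Minimal sketch of the artifact check described above: after CTE is
# applied, compare the measured point-to-point distance to the artifact's
# calibrated length. Numbers are invented for illustration.
import math

def check_artifact(p1, p2, calibrated_length, tolerance=0.0020):
    """Return (deviation, pass/fail) for a measured two-point artifact."""
    deviation = math.dist(p1, p2) - calibrated_length
    return deviation, abs(deviation) <= tolerance

dev, ok = check_artifact((0.0, 0.0, 0.0), (59.9987, 0.0, 0.0), 60.0000)
print(f"deviation {dev:+.4f} in, within tolerance: {ok}")
```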

Analyzing the Results – when point-to-point distances vary significantly from the calibrated artifact, the operator must determine what the overall impact on the measurement task will be. If the artifact is significantly bigger than the part, will the scale impact be acceptable within the part's envelope? If using a 7-parameter 'Best Fit' Transformation, is the 'nominal' information accurate? If the 'nominal' information was derived from a previous measurement, was CTE accounted for correctly?

The use and application of CTE can improve the accuracy and repeatability of nearly every measurement task when applied correctly. Refer to your Metrology Software package documentation for details of use within that software.

Top Read – Compensating for Thermal Expansion

The following page has received the most visits since JustMetrology.com came online in 2009.
 
Thermometer with Fahrenheit units (Image via Wikipedia)

For any individual working in the Dimensional Metrology industry, the subject of Compensating for the Thermal Expansion of materials has been an issue at one time or another. For accurate measurements, consideration must be given to the type of material and the temperature of that material, especially in environments where the ambient temperature is NOT 20° C (68° F) and the material is not at that temperature. Granted, there are materials that are not affected by temperature change; it is therefore imperative for the Measurement Technician to know when and how to properly apply Thermal Expansion compensation.

In the Dimensional Metrology industry it is common for the Compensation of Thermal Expansion procedure to be called "Scale", as the Metrology Software typically applies a "Scale Factor" to adjust the data being measured and analyzed for the actual part temperature. This Scale Factor can be applied in various fashions, from "Scaling" the measured data, to "Scaling" the measurement equipment, to "Scaling" a Least Squares Transformation, thus creating the 7-parameter Transformation. Each Metrology Software package uses specific processes for applying "Scale"; the Technician should be familiar with the best process for the measurement application in order to make the best decision on how to apply the Compensation of Thermal Expansion. Incorrect application of "Scale" can yield disastrous results, so great care should be taken when measuring part temperature, determining the type of material, and choosing the method used to apply the "Scale".

Why Scale Matters…..

During the year the ambient temperature in many manufacturing areas throughout the world changes with the seasons; the Winter months are much cooler in the manufacturing area than the Summer months. Although this change in temperature may only be 6-8° F, the change in the material, depending on the size of the object, can be significant. Consider this: in simple terms, Aluminum changes about 0.001″ in length per 100 inches for EVERY degree Fahrenheit above or below 68° F. This means that a structure 240.000 inches in length changes 0.0024″ for EVERY degree it varies from 68° Fahrenheit. Given that many accuracy requirements in the Dimensional Metrology industry these days are in the neighborhood of 0.0020″, the need to account for that change in material temperature should be easily understood. These values are general in nature and vary depending on the exact type of Aluminum being used.
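Here is the arithmetic behind that example, written out (the 1.0e-5 in/in/°F coefficient is the rounded rule of thumb the example above uses, not a handbook value for a specific alloy):

```python
# growth = alpha * length * delta_T, using the post's rounded rule of thumb
# for aluminum. Over a full 8 deg F seasonal swing, a 240 in structure moves
# nearly 10x a 0.0020 in accuracy requirement.
alpha = 1.0e-5          # in/in/deg F (rule of thumb for aluminum)
length = 240.000        # inches
delta_t = 8.0           # deg F swing between seasons
print(alpha * length * delta_t)  # 0.0192 in
```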

With the correct application of "Scale", the Metrology Software effectively brings every part back to the "same" temperature.

But my Equipment Compensates for the Temperature Automatically…..

Or does it? Typically, measurement systems make adjustments to their internal distance measurement components based on the ambient temperature in which they are being used. For instance, Laser Trackers, Total Stations and Scanners may require the Measurement Technician to enter a temperature; this temperature is used to adjust the measurement equipment so that it takes accurate measurements, but it makes no adjustment for the material being measured. When a system compensates for the material, the Technician is prompted for both the type of material and the part temperature; without this information it is NOT possible to accurately calculate and apply a Scale Factor.

If the system being utilized requires the use of a Certified Scale Bar to determine Scale, the equipment can provide an accurate "Scale" ONLY when the Scale Bar material and temperature are the SAME as those of the part being measured.

When do I apply the “Scale” Factor?

The answer is that each software package uses a different process to perform Scaling. It is best to consult an Expert on the Software in question, as the wrong information can have catastrophic results.

Scaling during a Least Squares Transformation

This use of “Scale” requires that the operator have X, Y, Z values for previously measured points that had the Compensation of Thermal Expansion applied correctly during the initial measurement process.

This type of Transformation is common when building large tooling structures in the Aerospace industry, as it allows the Measurement Technician to apply the "Scale Factor" one time when valuing the points and then export the point data to be used as Nominal Points throughout the life of the tool. As the temperature changes, the relationships of the points to one another change; the Least Squares Transformation process accounts for these changes automatically when using a 7-parameter transformation.

If the Measurement Technician is faced with a measurement task with limited constraints on the available points, the operator would typically "Fix" the Scale Factor. This method varies between Metrology Software packages; it is advised that prior to utilizing this type of Transformation, which requires "weighting" of points, the Technician consult a Software Expert.

How do I know if Scale is right?

A simple check that has become a standard in the Aircraft Manufacturing industry is to measure a Certified Scale Bar that is of the same material and temperature as the part being measured. When Scale has been applied correctly, the measurement equipment should be able to measure the Certified Scale Bar accurately (some shops allow up to +/- 0.0030 inches of deviation). Depending on the type of equipment, the measurement volume and the measurement task requirements, your deviations will vary. CAUTION: your Certified Scale Bar should measure within the accuracy expectations for the measurement task.

If you do not have a Certified Scale Bar of the same material as the part, it is only possible to verify that the measurement equipment itself is measuring "Scale" correctly. By measuring the points on the Certified Scale Bar in several different positions, the Technician can assess the accuracy of the Measurement System. It is critical that, prior to checking point-to-point distances, the operator correctly account for the material type and temperature of the Certified Scale Bar.

Metrology?

Growing up in the "Jet City", I have had the advantage of being exposed to airplanes from every conceivable angle: from watching 707's fly 150 feet above me, so loud you had to cover your ears, while I lay on the grass in my grandparents' backyard, to helping assemble 737 aircraft as a young man. It is this very aircraft industry that has provided me with so many opportunities I would otherwise not have had. Growing up in what was then a small farming town (now a booming metropolis), I never dreamed that the skills I would learn as a young man building aircraft jigs and fixtures would end up taking me all around the world and introducing me to some of the finest people in it.

My day today was spent working a booth at a trade show in a local airplane museum. Many of the attendees had little knowledge of precision measurement or "metrology", so once again I spent a good part of the day educating people on what precision measurement does for industries such as the aircraft industry. As I prepared to leave and was touring the museum, I found this display no more than 50 feet from where I had spent my day….coincidence?? I think not……there are many of us around who will look at this picture, smile, and say to ourselves….yep, that's where I started…….(apologies for the picture quality…).

Also see this classic post; Metrologist??

Checking Laser Tracker Accuracy

The Laser Tracker is considered the most accurate large-scale metrology device available for industrial measurements. These devices were developed in the early 1990's and are based on angle encoders and a laser interferometer that record angle and distance measurements. Using horizontal and vertical angle readings combined with the highly accurate distance measurement of the interferometer, 3D coordinates can be quickly calculated.
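In generic terms that calculation is a spherical-to-Cartesian conversion; here is a minimal sketch (axis and angle conventions vary by manufacturer, so this particular layout is illustrative):

```python
# Two angles plus one distance become a 3D coordinate. Here the vertical
# angle is measured up from the horizontal plane; real instruments differ.
import math

def polar_to_xyz(horizontal_deg, vertical_deg, distance):
    """Convert horizontal/vertical angles and range into X, Y, Z."""
    h = math.radians(horizontal_deg)
    v = math.radians(vertical_deg)
    x = distance * math.cos(v) * math.cos(h)
    y = distance * math.cos(v) * math.sin(h)
    z = distance * math.sin(v)
    return x, y, z

print(polar_to_xyz(45.0, 30.0, 120.0))  # ~(73.48, 73.48, 60.0)
```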

The Laser Tracker derives its name from the laser interferometer and from its ability to "track" an optical corner cube (often called a Spherically Mounted Reflector, or SMR). In order to "track" the corner cube, the laser interferometer must be "locked on" to it with a known distance established. All Laser Trackers utilize a tracker-mounted reset location (often called a "birdbath" due to its shape) where the corner cube is placed; the tracker is then commanded to turn to the known horizontal and vertical angles of this index, and the interferometer distance is set based on a previous calibration. In recent years the technology has advanced to what is commonly referred to as an "Absolute Distance Meter", or ADM; these ADMs are calibrated to the interferometer and allow the operator to "catch the beam" and set the interferometer distance WITHOUT the use of a birdbath. In 2011, trackers are sold almost exclusively with an ADM of some variety.

As you have probably already realized, there are many mechanical and electrical components at work in a Laser Tracker, so understanding whether the Laser Tracker is measuring accurately is of the utmost importance when taking precision measurements.

Having spent the last 17 years operating Laser Trackers of all varieties, I have found the misconceptions that follow these devices, and how to determine whether they are accurate, to be far and wide. When Laser Trackers were first introduced we viewed them as a faster, more accurate form of the Computer Aided Theodolite System; those systems were completely geometry based, so this view of the new technology was not accurate. Since theodolites only measured angles, the angle encoders were the only instrument parameter ever of any concern, and they could easily be checked using the "Plunge and Reverse" method of measurement. With Laser Trackers, this view of the world has continued in many organizations worldwide, with the phrase "all you have to do is check a backsight (aka Two Face) to determine if the Tracker is accurate."

Angles and Distances

Unlike the theodolite, the Laser Tracker measures a distance via the laser interferometer, the most accurate distance measurement device available to date. The laser interferometer can also contribute the largest amount of measurement error if it is not functioning correctly or is out of calibration. The angle encoders of the Laser Tracker, while extremely accurate, do not affect the Laser Tracker's measurements with the same magnitude as the laser interferometer.

For many years the Laser Tracker community has depended too much on evaluating whether a Laser Tracker's angle measurements are accurate, while not evaluating the entire Laser Tracker system; simply, as I view it, because it is "too complicated" to evaluate the entire system. I disagree.

Having had many "opportunities" to troubleshoot Laser Tracker measurement errors, the one thing I learned is that just because the tracker measures accurately in one area does not mean it is accurate throughout its entire measurement envelope. I have seen many trackers look good on all of their manufacturer's Field Check specifications but not repeat at all during large surveys, in particular large multiple-station surveys. The following is a simple method I like to use in a shop environment to quickly and easily determine how well the Laser Tracker is measuring. Of course, good metrology practice must be used: temperature and air pressure must be accurate, the instrument and points must be very stable, and so on.

Simple Tracker Accuracy Test

Typically, a tracker is checked for accuracy using the Backsight/Two Face test, and while this verifies the angular accuracy of the tracker, it does not address its distance accuracy. An interferometer distance check confirms the "birdbath" distance, but does not show distance repeatability throughout the entire "measurement envelope".

The accuracy of tracker measurements may be verified by measuring point-to-point distances throughout the measurement envelope. A properly calibrated tracker will repeat all distance measurements within +/- 0.0020″.

The method uses two points that will not move during the measurement process; these two points will be measured from five different orientations of the tracker. The first measurements should be done inline, at approximately the same height as the tracker, and repeated twice. Check the point-to-point distance of these measurements and average them; this will be your baseline distance. All further measurements will be compared to this baseline, with the total range being the difference between the longest and shortest point-to-point distance.

The tracker is then moved to the side so that an equilateral triangle is formed between the two points and the tracker. Measure the points again under different point names, rotate the tracker 90 degrees, and measure the two points again; repeat until the points have been measured from all quadrants.

Check the distances between all of these measurements; the total range should be within 0.0020″. If it is not, a calibration should be performed to correct the distance measurement errors.
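The bookkeeping for this test is simple enough to sketch; the distances below are invented placeholders, one per tracker orientation:

```python
# Range check for the two-point test described above (distances in inches).
distances = {
    "inline_1": 120.0003,
    "inline_2": 119.9998,
    "quadrant_1": 120.0011,
    "quadrant_2": 119.9996,
    "quadrant_3": 120.0007,
    "quadrant_4": 120.0001,
}
# Per the text, the two inline shots are averaged to form the baseline.
baseline = (distances["inline_1"] + distances["inline_2"]) / 2
spread = max(distances.values()) - min(distances.values())
print(f"baseline {baseline:.4f} in, range {spread:.4f} in, pass: {spread <= 0.0020}")
```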

Layout of Simple Tracker Accuracy Test

Using this method should help you quickly identify the accuracy of the Laser Tracker in your shop environment. If you can perform these tests within a controlled environment, with a recently calibrated Laser Tracker, your results should be less than 0.0015″.

Accuracy vs. Precision

To the layman the terms accuracy and precision are interchangeable; to the Metrologist they are two distinctly different concepts. The hardware manufacturers use these specifications as marketing tools, and seem to prefer to keep the "public" in a gray area as to how accurate and precise their equipment really is.

Representation of high accuracy and low precision. (Photo credit: Wikipedia)

To be clear, the following definitions of Accuracy and Precision can be found on Wikipedia:

In the fields of science, engineering, industry and statistics, the accuracy of a measurement system is the degree of closeness of measurements of a quantity to its actual (true) value. The precision of a measurement system, also called reproducibility or repeatability, is the degree to which repeated measurements under unchanged conditions show the same results.
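A minimal numeric illustration of the distinction (with invented numbers): a gauge can be very precise and still inaccurate if it carries a bias.

```python
# Accuracy: closeness of the average to the true value.
# Precision: spread of repeated measurements. For a normal distribution,
# 1 sigma covers ~68% of readings, 2 sigma ~95%.
import statistics

true_value = 100.0000
repeats = [100.0031, 100.0029, 100.0030, 100.0032, 100.0028]  # tight but biased

bias = statistics.mean(repeats) - true_value   # +0.0030 -> poor accuracy
sigma = statistics.stdev(repeats)              # ~0.00016 -> high precision
print(f"bias {bias:+.4f}, 1-sigma {sigma:.5f}, 2-sigma {2 * sigma:.5f}")
```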

It would seem from reading these definitions that it should be fairly easy to come up with universal specifications for Metrology Systems; however, a quick review of the brochures found online quickly clouds the issue. Accuracy specifications are always stated, as is the precision specification; the "gray area" is in the methods used to create the specification. How many measurements were taken? Over what volume were the measurements taken to produce the specification? Some manufacturers clearly define their testing process, while others provide a specification without a clear definition of how the values were derived.

To further cloud the issue (is this an intentional marketing ploy??), one brochure will state a specification as, for example, "Accuracy 0.0012 inches", while another system of the same type will state, for example, "Accuracy +/- 0.0007 inches". If your job is to evaluate the best system for your particular measurement task, you would quickly question whether the 0.0012 inch specification is total error, volumetric error or maximum permissible error, each type of error being unique. The second value provides more information; however, without knowing how the tests were performed, the questions remain.

To further "muddy" the waters, there is the question of "What sigma is the specification?" I have seen the same companies offer different sigma specifications for their equipment; again, this seems to be marketing. A one sigma specification is always going to "look" much better to the untrained eye, while a two sigma specification may look larger; but, as the trained Metrologist knows, it is the better specification, as it is closer to "real life". A one sigma specification means that 68% of the time I will achieve the stated accuracy and precision….what about the other 32%???? Again, marketing plays a part when the brochure is created: whatever looks better on the spec sheet is what they prefer to use. The technical people at those same companies are never happy with this, as they are left to explain to the customer AFTER the purchase what the specifications truly mean; this is NEVER a good conversation.

In the United States, the National Institute of Standards and Technology (NIST) assists by providing standards for the testing of our measurement equipment. Perhaps in the future, with enough interest from those in the industry, a single method of testing can be developed and become the standard, thus eliminating the simple questions: How accurate is the device? How precise is the device?

Should it really be that difficult for an experienced Dimensional Metrologist to determine which Device is going to provide the best measurement data for their measurement task?

