The Impact of OOT Calibration Weights
Regarding calibration weights used for internal balance checks, the impact of out-of-tolerance (OOT) “As Found” results on the weight calibration certificate is, in most cases, surprisingly negligible.
The first thing to assess is how far out of tolerance the calibration weight was and how that value compares to the process tolerance and/or the readability of the respective balance. In most cases the tolerances for calibration weights are very tight (Class 1, Class 0, Ultra Class, etc.). Often, the amount by which the calibration weight is out of tolerance could not even be detected or measured on a particular balance.
For example, suppose a 100mg calibration weight with a Class 1 tolerance of 0.010mg is found to have an OOT value of -0.014mg on the weight calibration certificate. The calibration weight is therefore -0.004mg past the allowable Class 1 tolerance. If we were using this calibration weight on a Mettler Toledo XS205DU analytical balance, with a readability of 0.01mg at the 100mg range, this -0.004mg value cannot be measured or displayed. In this case, the OOT calibration weight has no impact.
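The first assessment step can be sketched as a simple comparison. This is an illustrative helper (the function name and structure are assumptions, not a standard procedure); the values come from the example above.

```python
# Hypothetical impact check for an OOT calibration weight: compare the amount
# by which the weight exceeds its class tolerance against the balance's readability.

def oot_excess_mg(found_error_mg: float, class_tolerance_mg: float) -> float:
    """Amount (mg) by which the found error exceeds the class tolerance."""
    return max(abs(found_error_mg) - class_tolerance_mg, 0.0)

# 100mg weight, Class 1 tolerance of 0.010mg, "As Found" error of -0.014mg:
excess = oot_excess_mg(found_error_mg=-0.014, class_tolerance_mg=0.010)  # 0.004mg

readability_mg = 0.01  # XS205DU readability at the 100mg range
if excess < readability_mg:
    print("OOT excess cannot be resolved by the balance -> no impact")
else:
    print("OOT excess is measurable on the balance -> assess further")
```

Because the 0.004mg excess is below the 0.01mg readability, the check falls through to the "no impact" branch, matching the conclusion above.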
The second issue in assessing the impact of OOT calibration weights is what internal balance check tolerance is being applied. A tolerance of 0.1% of the applied calibration weight is very common in the pharmaceutical industry. Let's use the same -0.014mg value from the previous 100mg calibration weight example, but this time say the weight is a Class 0 calibration weight with a tolerance of 0.005mg, and the balance it is used with is a Mettler Toledo XP6 micro balance with a readability of 0.001mg. The 100mg calibration weight was -0.009mg out of the Class 0 tolerance. This -0.009mg value can be measured and displayed on the micro balance, so it now needs to be compared to the applied internal balance check tolerance (0.1% of the calibration weight), which in this case is 0.100mg. The weight's OOT value of -0.009mg is well within the 0.100mg tolerance of the internal daily balance check, so the impact of the OOT calibration weight is negligible.
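The second assessment step can be sketched the same way. The variable names are assumptions for illustration; the 0.1% check tolerance and the -0.009mg OOT amount are taken from the example above.

```python
# Second step: the OOT amount IS measurable on the micro balance, so compare
# it to the internal balance check tolerance of 0.1% of the applied weight.

nominal_mg = 100.0
check_tolerance_mg = nominal_mg * 0.001   # 0.1% of 100mg = 0.100mg
oot_amount_mg = 0.009                     # amount beyond the Class 0 tolerance

if oot_amount_mg <= check_tolerance_mg:
    print("OOT amount well within the daily check tolerance -> negligible impact")
else:
    print("OOT amount exceeds the check tolerance -> investigate")
```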
The above two examples are metrological or scientific in nature, where the data is used to help determine the impact. It is also important to remember that the accuracy of the balance is not affected by the OOT calibration weights used in the respective internal balance check. The calibration weights are used as independent checks and have no effect on the accuracy of the balance. So, just because calibration weights were found to be OOT on the weight calibration certificate, this does not mean that the accuracy of any previous calibrations or measurements performed on the balance should be deemed inaccurate or questionable.
The one scenario in which OOT calibration weights would be of concern is when the weights have been used to “calibrate” the balance. Calibrating the balance is defined as entering the balance's menu program, as per the operating instructions, and adjusting the accuracy of the balance by following the calibration steps for that balance with the specified calibration weights. This is usually performed by an experienced balance calibration service technician.
Weight Checks for Laboratory Balances
The first thing that needs to be determined is what calibration weights are needed. This entails choosing the calibration weight class, weight range, and weight test points. Ideally, a weight should be more accurate than the weighing (test) instrument, though this is not always possible. Whenever possible, the calibration weight range should bracket the user range of the balance.
If an analytical balance has a readability of 0.0001g (gram) and a needed measurement range of 100g (grams) down to 100mg (milligrams), with a critical measurement point at 1g (gram), then the NIST traceable calibration weights needed for this internal check would be, at a minimum, 100g, 1g, and 100mg. The accuracy class for the calibration weights should be ASTM Class 1, a very precise calibration weight accuracy and the most common in domestic laboratory applications.
So, if we have those three weights incorporated into our internal check program, we have bracketed our measurement range for that particular analytical balance. In addition, with the very accurate ASTM Class 1 weights, we can feel confident that when a weight is placed on the analytical balance and we see a very large error or discrepancy (example: the 1g weight gives a balance reading of 1.0030g), there is an inaccuracy with the balance. It is then time to contact the balance calibration vendor and, more importantly, to take that particular analytical balance out of service until the inaccuracy is corrected.
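A daily weight check like the one described above could be sketched as follows. The helper function and the 0.1% pass/fail tolerance are assumptions for illustration (your SOP may specify a different tolerance); the 1g reading of 1.0030g is from the example above.

```python
# Illustrative internal weight-check pass/fail test: flag a balance for service
# when a reading differs from the weight's nominal value by more than the
# assumed check tolerance (0.1% of nominal here).

def weight_check_passes(nominal_g: float, reading_g: float,
                        tolerance_fraction: float = 0.001) -> bool:
    """Return True if the balance reading is within tolerance of the nominal value."""
    return abs(reading_g - nominal_g) <= nominal_g * tolerance_fraction

# The 1g Class 1 weight reading 1.0030g from the example above:
print(weight_check_passes(1.0, 1.0030))  # 3mg error on a 1g weight -> fails
```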
The only time ASTM Class 1 weights would not be sufficient is in micro balance applications. Micro balances are extremely fine balances with readabilities of 0.000001g (gram) or even 0.0000001g (gram). For micro balance applications, ASTM Class 0 or Ultra Class (manufacturer accuracy class) weights are needed. These are the most accurate weight classes and are better suited to the extremely fine readability of micro balances.
The internal calibration weight balance check is not intended to duplicate the balance calibration performed by calibration technicians during regularly scheduled preventive maintenance. It is part of your quality assurance program to ensure that your weighing instruments are measuring accurately. The checks should be performed as per a standard operating procedure (SOP), with results documented in the respective logbook.
Also keep in mind that internal calibration weight checks alone are not sufficient. Regularly scheduled preventive maintenance and calibration with a qualified vendor are still needed. And lastly, make sure your internal calibration weights are calibrated on an appropriate interval (six months, annually, etc.) to ensure accuracy and NIST traceability.
For more info, please see our “Internal Calibration Weight Verifications (Checks)” under Technical Articles on this site.
Figuring out the Accuracy Classes
This is a quick guide and background on weight accuracy classifications for your balance calibration weights (i.e., mass standards) to ensure traceable and accurate weighing measurements and calibrations.
When choosing balance calibration weights for your weighing and measurement application, the first thing to address is what accuracy class will be needed. Accuracy classifications are predetermined accuracy designations set at the time of manufacture. Currently there are three major categories for classifying precision laboratory weights: ASTM, OIML, and manufacturer classes (Ultra Class and Ulti Class).
WEIGHT CLASS BACKGROUND
The ASTM classification is per the document ASTM E617. Most domestic (United States) weight classifications adhere to this specification. The weight classes are broken out numerically from ASTM Class 0 to 7, with the most accurate classes first: ASTM Class 0 has the tightest allowable tolerance of the weight classifications, followed by ASTM Class 1 as the next most accurate. In most precision laboratory and calibration applications, ASTM Class 4 is about as far down as you would want to go.
The OIML classification is used internationally (Europe, Africa, Asia, South America, etc.). OIML R 111-1 is the document for the OIML weight classifications. The classifications are broken out by alphanumeric designations: Class E1, Class E2, Class F1, Class F2, Class M1, Class M2, and Class M3. OIML Class E1 is the most accurate, with the tightest allowable tolerance, followed by OIML Class E2 and OIML Class F1 respectively. For precision laboratory and calibration applications, OIML Class F2 should be the minimum accuracy used.
Manufacturer weight classifications, Ultra Class and Ulti Class, are for very precise laboratory applications. These classifications are recognized only by the end users and the manufacturers themselves; the weighing regulating bodies of ASTM, NIST, and OIML do not formally recognize Ultra Class or Ulti Class. The classifications and tolerance values are very similar to the ASTM Class 0 weight classification.
ACCURACY CLASS APPLICATIONS AND USAGE
There aren’t too many standards or references in terms of what weights to use and when.
Speaking in metrological terms, a practical guide would be the following:
ASTM Class 0, Ultra Class, and OIML Class E1 should be used at the highest level of precision, i.e., mass standards (calibrating other weights), micro-balance testing and calibration, and critical weighing applications.
ASTM Class 1, 2 and OIML Class E2, F1 should be used in precision applications, i.e., analytical balance testing and calibration.
ASTM Class 3, 4 and OIML Class F1, F2 are best suited for top-loading balance calibrations and testing, and moderate-precision (non-critical laboratory) applications.
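The usage guide above can be summarized as a lookup table. This is a convenience sketch, not a normative mapping; the application categories and class lists are taken directly from the guide, and the dictionary keys are assumed shorthand labels.

```python
# Hedged summary of the accuracy-class usage guide as a Python dict.
RECOMMENDED_CLASSES = {
    "mass standards / micro balances / critical": ["ASTM 0", "Ultra Class", "OIML E1"],
    "analytical balances (precision)":            ["ASTM 1", "ASTM 2", "OIML E2", "OIML F1"],
    "top-loading balances (moderate precision)":  ["ASTM 3", "ASTM 4", "OIML F1", "OIML F2"],
}

for application, classes in RECOMMENDED_CLASSES.items():
    print(f"{application}: {', '.join(classes)}")
```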