© SATRA Technology Centre. Reproduction is not permitted in any form without prior written permission from SATRA.
Uncertainty, ISO/IEC 17025:2017 and decisions made on conformity
How measurement uncertainty is accounted for when stating conformity with a specified requirement.
When ISO/IEC 17025 was republished in 2017, the new version included a requirement that laboratories take account of uncertainty of measurement when reporting results against a specification or requirement.
Clause 3.7 of ISO/IEC 17025:2017 defines a ‘decision rule’ as a rule that describes how measurement uncertainty is accounted for when stating conformity with a specified requirement. The decision rule applied must be identified in the statement of conformity of the test report unless it is inherent in the requested specification or standard.
This article provides information on how SATRA applies the decision rule to its test reports.
What is uncertainty?
The science of measurement is known as ‘metrology’. No measurement is perfectly exact: the act of measuring itself introduces variation into the result. For example, the resolution of the equipment sets a limit on accuracy. An instrument with a resolution of ‘0.1’ cannot record a quantity that is actually ‘0.665’ – it is only able to indicate ‘0.6’ or ‘0.7’. Thus, there is an uncertainty of measurement. Even the act of handling a steel rule will warm it and cause it to expand slightly in length, introducing a further small ‘uncertainty’.
There are many examples where small variables and constants introduce variation into repeated measurements. The consequence is that repeated measurements of the same quantity form a group of results clustered around the true value. This group can be treated as a ‘statistical population’. Most results will be close to the true value, but some will drift away from it. The group of repeated results will most often – but not always – follow a statistical curve (a distribution). From this distribution, a ‘confidence interval’ can be stated: a measure of the probability that the reported result lies close to the true value. The statement ‘k=2’ means that the possible statistical variation of the result (the uncertainty) has been taken as two standard deviations either side of the mean value. Hence, for a result reported with k=2, there is a 95 per cent confidence that the true value lies within the stated uncertainty of the reported result. When comparing results that are close to requirements, competence is required in applying the calculated confidence of the measured result.
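The relationship between repeated readings, the standard deviation and the k=2 expanded uncertainty can be sketched in a few lines of Python. The readings below are purely illustrative values, not SATRA data:

```python
import statistics

# Hypothetical repeated readings of the same quantity (illustrative values only).
readings = [10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7, 10.0]

mean = statistics.mean(readings)
std_dev = statistics.stdev(readings)  # sample standard deviation

# Expanded uncertainty with coverage factor k=2, giving roughly
# 95 per cent confidence for a normally distributed population.
k = 2
expanded_uncertainty = k * std_dev

print(f"Result: {mean:.2f} ± {expanded_uncertainty:.2f} (k=2)")
```

For these readings the result would be reported as 10.00 ± 0.40 (k=2): there is approximately 95 per cent confidence that the true value lies within ±0.40 of the mean.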
Applying uncertainty when reporting results
SATRA guidelines – Where a result is reported against a SATRA guideline value, uncertainty of measurement has been taken into account when determining the guideline value itself. As such, uncertainty is not required to be considered when determining ‘pass/fail’ criteria.
Customer or other specifications – If a result is reported against a customer-supplied specification value, SATRA will have no information on whether or not uncertainty was taken into account when the requirement was determined. We will therefore not consider uncertainty when determining the pass/fail criteria, basing this instead on the raw data obtained. Information will, however, be provided where this is appropriate.
Qualitative results – Where the result is a simple ‘yes’ or ‘no’ to the presence of something, uncertainty is not applied.
Visual or subjective assessments – In cases where the result is a pass or fail against a visual or subjective assessment, uncertainty cannot be applied to the final result. In this instance, SATRA will apply uncertainty to the test itself on a worst-case basis before the assessment is made.
For instance, the load, time or number of cycles of a test can be increased by a fixed amount equal to the calculated uncertainty.
Where possible, the assessment of the test specimen will be made both after completion of the original uncorrected test and after uncertainty has been applied and results reported accordingly.
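The worst-case adjustment described above can be sketched as follows. The parameter names and magnitudes are assumptions for illustration, not values from any SATRA test method:

```python
# Hypothetical test parameters and their calculated uncertainties
# (illustrative names and values only).
test_params = {"load_N": 500.0, "cycles": 10000}
uncertainty = {"load_N": 5.0, "cycles": 100}

# Increase each parameter by its calculated uncertainty so that the
# test applied is at least as severe as the specification requires.
adjusted = {name: test_params[name] + uncertainty[name] for name in test_params}

print(adjusted)  # {'load_N': 505.0, 'cycles': 10100}
```

The specimen would then be assessed after the uncorrected test and, where possible, again after this worst-case test, with both outcomes reported.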
Numerical results – When reporting numerical results against a conformance statement, the effect of uncertainty of measurement on the result obtained is considered both positively and negatively, before a pass or failure or the allocation of a class or performance level is reported.
In most situations, the uncertainty of measurement is irrelevant to the interpretation of conformity, providing that the result obtained is not too close to the requirement. However, when the result lies close to a requirement limit, we are obliged to indicate that the uncertainty of the result may affect the interpretation of conformity.
When applying uncertainty, we use the statistical basis of k=2. This provides a coverage probability of approximately 95 per cent, which is also called the ‘expanded uncertainty’.
Where the result falls outside the ‘guard banding’ (the requirement plus or minus the uncertainty), the risk of the result being a false accept or false reject is minor, and in this case a pass or fail, class or level will be reported.
Where the result falls inside the guard banding, there is an increased risk that the result is a false accept or a false reject. In this instance, SATRA will not provide a pass or fail statement, or a class or level, but will include information in the notes in relation to the result obtained.
An example of this is if a requirement is stated as being a minimum of 100 units and the uncertainty of the test result is estimated to be ±5 units. In this situation, the ‘guard band’ (the requirement plus or minus the value of uncertainty associated with the test) is deemed as being 95 to 105 units. Any result falling within this range is considered as being ‘uncertain’, and statements such as ‘pass’ or ‘fail’ cannot be made with confidence (see figure 1).
With performance levels or classes, uncertainty is applied to the requirement of each level or class, with each having its own guard band.
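The guard-band decision rule described above, using the worked example of a 100-unit minimum requirement and ±5 units of uncertainty, can be expressed as a short sketch. The treatment of results exactly on the guard-band boundary (reported as ‘uncertain’ here) is an assumption for illustration:

```python
def conformity_statement(result: float, limit: float, uncertainty: float) -> str:
    """Guard-band decision rule for a minimum requirement (illustrative sketch).

    Results clearly above limit + uncertainty pass; results clearly below
    limit - uncertainty fail; anything inside the guard band is reported
    as 'uncertain' rather than given a pass/fail statement.
    """
    if result > limit + uncertainty:
        return "pass"
    if result < limit - uncertainty:
        return "fail"
    return "uncertain"

# Worked example from the text: minimum of 100 units, uncertainty ±5 units,
# so the guard band spans 95 to 105 units.
print(conformity_statement(108, 100, 5))  # pass
print(conformity_statement(102, 100, 5))  # uncertain
print(conformity_statement(92, 100, 5))   # fail
```

For performance levels or classes, the same function would be applied to the requirement of each level in turn, each with its own guard band.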
How can we help?
Please contact email@example.com for assistance with decision rules and ISO/IEC 17025:2017.