WMO/GAW Glossary of QA/QC-Related Terminology

Version 1.0, 2010-09-14 (last update: 2016-05-26; minor changes, see Version history for details)


The evaluation and characterisation of data obtained from measurements made within the WMO/GAW Programme involve a number of statistical parameters and specific terms to characterise data quality. At present, several of these terms (e.g. precision) are frequently used with different meanings by different people. Efforts towards standardization have been made in the past, involving contributions from a number of international organizations, and are coordinated under the umbrella of →ISO.

With the aim of ensuring the comparability and compatibility of measurements, the GAW Strategic Plan [6] recommends adoption and use of internationally accepted methods and vocabulary to deal with measurement uncertainty as outlined in various ISO/BIPM publications [1-5]. Since each term should have the same meaning for all of its users, efforts are called for to familiarize all individuals involved in the WMO/GAW Programme and the associated scientific community with the relevant terminology. GAW members are strongly encouraged to use these terms in their own publications and to suggest their use when reviewing manuscripts of others.

Since the first edition (ver 0.4) of this glossary, the VIM [2] has been greatly expanded. The current 3rd edition of the VIM [1], entitled "International vocabulary of metrology — Basic and general concepts and associated terms (VIM)", is freely available through the Joint Committee for Guides in Metrology. It now includes a number of terms that were - to the knowledge of the editors - not previously defined in ISO/BIPM publications and that can now be referenced directly to this one authoritative source. As a result, the number of terms defined in the latest edition of the VIM is overwhelming at first encounter, and the editors have tried to select those that are deemed most relevant for WMO/GAW. Moreover, in some cases notes and/or examples given in the VIM [1] have been removed, while additional notes and/or examples have been added with respect to the specific requirements of the WMO/GAW Programme. If you find that terms are missing from this glossary, please contact one of the editors.

The 3rd edition of the VIM, while freely available on the internet, is copyrighted material. The editors of this WMO/GAW glossary are therefore grateful to BIPM and the JCGM for their generous handling of copyright matters and are pleased to insert the following acknowledgment: "Extracts from the International Vocabulary of Metrology – Basic and General Concepts and Associated Terms (VIM), 3rd edition, JCGM 200:2008 (→www.bipm.org/en/publications/guides/vim.html) are published with permission of the Director of the International Bureau of Weights and Measures (BIPM), in his function as Chairman of the JCGM."



Alphabetical list of terms

accuracy | adjustment of a measuring system | audit | calibration | calibration curve | calibration hierarchy | Central Calibration Laboratory (CCL) | certified reference material | combined standard measurement uncertainty | concentration | conventional quantity value | conventional reference scale | correction | coverage factor | coverage interval | coverage probability | data quality objectives (DQOs) | definitional uncertainty | expanded measurement uncertainty | indication | input quantity in a measurement model | international system of units | laboratory standard | measurand | measured quantity value | measurement | measurement accuracy | measurement bias | measurement error | measurement guideline (MG) | measuring instrument | measurement precision | measurement procedure | measurement repeatability | measurement reproducibility | measurement result | measurement trueness | measurement standard | measuring system | measurement uncertainty | metrological comparability of measurement results | metrological compatibility of measurement results | metrological traceability | metrological traceability chain | (mass) mixing ratio | (volume) mixing ratio | mole fraction | nominal quantity value | ordinal quantity | output quantity in a measurement model | precision | primary measurement standard | quality assurance | quality control | quantity | quantity value | random measurement error | reference material | reference measurement standard | reference quantity value | reference scale | repeatability condition of measurement | reproducibility condition of measurement | resolution | secondary measurement standard | sensitivity of a measuring system | selectivity of a measuring system | (measurement) standard | standard measurement uncertainty | standard operating procedure (SOP) | standard scale | surveillance cylinder | systematic measurement error | target cylinder (target gas) | tertiary standard | transfer measurement device | travelling measurement standard | 
true quantity value | Type A evaluation of measurement uncertainty | Type B evaluation of measurement uncertainty | World Calibration Centre (WCC) | working measurement standard | zero adjustment of a measuring system


SECTION 1 - Quantities and Units

[1.1] quantity #top#

property of a phenomenon, body, or substance, where the property has a magnitude that can be expressed as a number and a reference [1]

  1. The generic concept 'quantity' can be divided into several levels of specific concepts, e.g., 'length' can be specified as 'radius', or even more specifically, as 'radius of a circle'. The VIM [1] gives more detail and examples.
  - EXAMPLE (GAW): Typical quantities include, e.g., the 'ozone mole fraction', the 'aerosol scattering coefficient', or 'down-welling longwave irradiance'.

[1.16] International System of Units #top#


system of units, based on the International System of Quantities, their names and symbols, including a series of prefixes and their names and symbols, together with rules for their use, adopted by the General Conference on Weights and Measures (CGPM) [1]

  1. The VIM [1] contains a description of the SI system and defines associated terms. For the sake of brevity, the reader is referred to the notes of sub-chapter 1.16 for more details.

[1.19] quantity value #top#

value of a quantity

value

number and reference together expressing magnitude of a →quantity [1]

  - EXAMPLE: Celsius temperature of a given sample: -5 °C [1].
  1. A quantity that cannot be expressed as a unit of measurement multiplied by a number may be expressed by reference to a →conventional reference scale or to a →measurement procedure or both [2].

[1.20] numerical quantity value #top#

numerical value of a quantity

numerical value

number in the expression of a →quantity value, other than any number serving as the reference [1]

  1. For quantities of dimension one, the reference is a measurement unit which is a number and this is not considered as a part of the numerical quantity value.
    - EXAMPLE: In an amount-of-substance fraction equal to 3 mmol/mol, the numerical quantity value is 3 and the unit is mmol/mol. The unit mmol/mol is numerically equal to 0.001, but this number 0.001 is not part of the numerical quantity value, which remains 3.
  2. For quantities that have a measurement unit (i.e. those other than ordinal quantities), the numerical value {Q} of a quantity Q is frequently denoted {Q} = Q/[Q], where [Q] denotes the measurement unit.
    - EXAMPLE: For a quantity value of 5.7 kg, the numerical quantity value is {m} = (5.7 kg)/kg = 5.7. The same quantity value can be expressed as 5 700 g in which case the numerical quantity value {m} = (5 700 g) / g = 5 700.
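
The notation {Q} = Q/[Q] from Note 2 can be illustrated with a short sketch. The unit table and helper function below are hypothetical conveniences for illustration; the point is that only the numerical quantity value changes when the unit is changed, not the quantity value itself:

```python
# Sketch of {Q} = Q / [Q]: the same mass expressed with two units.
# Units are represented by their size in an (arbitrary) base unit, here g.
UNITS = {"g": 1.0, "kg": 1000.0}

def numerical_value(value_in_g: float, unit: str) -> float:
    """Return the numerical quantity value {Q} = Q / [Q]."""
    return value_in_g / UNITS[unit]

mass_g = 5700.0  # the quantity value Q, expressed in the base unit g

print(numerical_value(mass_g, "g"))   # {m} = 5700.0
print(numerical_value(mass_g, "kg"))  # {m} = 5.7
```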

[1.26] ordinal quantity #top#

→quantity, defined by a conventional →measurement procedure, for which a total ordering relation can be established, according to magnitude, with other quantities of the same →kind, but for which no algebraic operations among those quantities exist [1]

  1. Ordinal quantities can enter into empirical relations only and have neither →measurement units nor →quantity dimensions. Differences and ratios of ordinal quantities have no physical meaning.

[1.29] conventional reference scale #top#

quantity-value scale defined by formal agreement [1]

  1. The scale is based upon a number of →primary measurement standards and a measurement procedure to interpolate other values.
  2. Within WMO/GAW, the conventional reference scale refers in particular to the calibration scale used within the WMO/GAW network. In the case of CO2, CH4 and N2O, this scale is implemented as a family of gas cylinders maintained at the →CCL (NOAA). For the other scales accepted in the WMO/GAW Programme, please consult the GAW Strategic Plan or the GAW web site (Central Facilities).

[1.30] nominal property #top#

property of a phenomenon, body, or substance, where the property has no magnitude [1]

  - EXAMPLES: sex of a human being; colour of a paint sample; colour of a spot test in chemistry; ISO two-letter country code; sequence of amino acids in a polypeptide [1].
  1. A nominal property has a value, which can be expressed in words, by alphanumerical codes, or by other means.
  2. 'Nominal property value' is not to be confused with →nominal quantity value.


SECTION 2 - Measurement

[2.1] measurement #top#

process of experimentally obtaining one or more →quantity values that can reasonably be attributed to a →quantity [1]

  1. Measurement does not apply to →nominal properties.
  2. Measurement implies comparison of quantities and includes counting of entities.
  3. Measurement presupposes a description of the quantity commensurate with the intended use of a →measurement result, a →measurement procedure, and a calibrated →measuring system operating according to the specified measurement procedure, including the measurement conditions.

[2.3] measurand #top#

→quantity intended to be measured [1]

  1. The specification of a measurand requires knowledge of the →kind of quantity, description of the state of the phenomenon, body, or substance carrying the quantity, including any relevant component, and the chemical entities involved.
  2. In the second edition of the VIM [1] and in IEC 60050-300:2001, the measurand is defined as the 'quantity subject to measurement'.
  3. The →measurement, including the →measuring system and the conditions under which the measurement is carried out, might change the phenomenon, body, or substance such that the quantity being measured may differ from the →measurand as defined. In this case, adequate correction is necessary.
    - EXAMPLE: The length of a steel rod in equilibrium with the ambient Celsius temperature of 23 °C will be different from the length at the specified temperature of 20 °C, which is the measurand. In this case, a correction is necessary.
    - EXAMPLE (GAW): The mole fraction of a specific trace gas will change with varying humidity.
  4. In chemistry, "analyte", or the name of a substance or compound, are terms sometimes used for 'measurand'. This usage is erroneous because these terms do not refer to quantities (e.g. ozone mixing ratio can be a measurand while ozone by itself cannot).

[2.6] measurement procedure #top#

detailed description of a measurement according to one or more measurement principles and to a given measurement method, based on a measurement model and including any calculation to obtain a measurement result [1]

  1. A measurement procedure is usually documented in sufficient detail to enable an operator to perform a measurement. [1].
  2. A measurement procedure can include a statement concerning a target measurement uncertainty.
  3. A measurement procedure is sometimes called a →standard operating procedure, abbreviated SOP.
  4. A measurement procedure can be replaced by a →measurement guideline (MG) if a measurement can be realized in several different ways.

[2.9] measurement result #top#

result of a measurement

set of →quantity values being attributed to a →measurand together with any other available relevant information [1]

  1. A measurement result generally contains "relevant information" about the set of quantity values, such that some may be more representative of the measurand than others. This may be expressed in the form of a probability density function (PDF).
  2. A measurement result is generally expressed as a single →measured quantity value and a →measurement uncertainty. If the measurement uncertainty is considered to be negligible for some purpose, the measurement result may be expressed as a single measured quantity value. In many fields, this is the common way of expressing a measurement result.
  3. In the traditional literature and in the previous edition of the VIM [2], measurement result was defined as a value attributed to a measurand and explained to mean an →indication, or an uncorrected result, or a corrected result, according to the context.
  - EXAMPLE (GAW): The NO mole fraction is 2 nmol/mol.

[2.10] measured quantity value #top#

measured value of a quantity

measured value

→quantity value representing a →measurement result [1]

  1. For a →measurement involving replicate →indications, each indication can be used to provide a corresponding measured quantity value. This set of individual measured quantity values can be used to calculate a resulting measured quantity value, such as an average or median, usually with a decreased associated →measurement uncertainty.
  2. When the range of the →true quantity values believed to represent the →measurand is small compared with the measurement uncertainty, a measured quantity value can be considered to be an estimate of an essentially unique true quantity value and is often an average or median of individual measured quantity values obtained through replicate measurements.
  3. In the case where the range of the true quantity values believed to represent the measurand is not small compared with the measurement uncertainty, a measured value is often an estimate of an average or median of the set of true quantity values.
  4. In the →GUM, the terms "result of measurement" and "estimate of the value of the measurand" or just "estimate of the measurand" are used for 'measured quantity value'.

[2.11] true quantity value #top#

true value of a quantity

true value

→quantity value consistent with the definition of a quantity [1]

  1. In the Error Approach to describing →measurement, a true quantity value is considered unique and, in practice, unknowable. The Uncertainty Approach is to recognize that, owing to the inherently incomplete amount of detail in the definition of a quantity, there is not a single true quantity value but rather a set of true quantity values consistent with the definition. However, this set of values is, in principle and in practice, unknowable. Other approaches dispense altogether with the concept of true quantity value and rely on the concept of →metrological compatibility of measurement results for assessing their validity.
  2. In the special case of a fundamental constant, the quantity is considered to have a single true quantity value.
  3. When the →definitional uncertainty associated with the →measurand is considered to be negligible compared to the other components of the →measurement uncertainty, the measurand may be considered to have an "essentially unique" true quantity value. This is the approach taken by the GUM and associated documents, where the word "true" is considered to be redundant.

[2.12] conventional quantity value #top#

conventional value of a quantity

conventional value

→quantity value attributed by agreement to a →quantity for a given purpose [1]

  1. The term "conventional true quantity value" is sometimes used for this concept, but its use is discouraged.
  2. Sometimes a conventional quantity value is an estimate of a true quantity value.
  3. A conventional quantity value is generally accepted as being associated with a suitably small measurement uncertainty, which might be zero.
  4. The previous VIM [2] defines a similar term called →assigned value. It is noted that this is still the preferred term in the WMO/GAW Programme.
  - EXAMPLE (GAW): The CO2 standard in cylinder No. x has an assigned mole fraction of e.g. 381.00 µmol/mol.

[2.13] measurement accuracy #top#

accuracy of measurement


closeness of agreement between a →measured quantity value and a →true quantity value of a →measurand [1]

  1. The concept 'measurement accuracy' is not a →quantity and is not given a →numerical quantity value. A →measurement is said to be more accurate when it offers a smaller →measurement error.
  2. The term "measurement accuracy" should not be used for →measurement trueness and the term →measurement precision should not be used for 'measurement accuracy', which, however, is related to both these concepts.
  3. 'Measurement accuracy' is sometimes understood as closeness of agreement between measured quantity values that are being attributed to the measurand.
  4. The term "accuracy", when applied to a set of measurement results, involves a combination of random components and a common systematic error or bias component.
  - EXAMPLE (GAW): CO mole fraction measurements in the atmosphere with instrument A have higher accuracy (e.g. measurement error = 2 nmol/mol) than with instrument B (e.g. measurement error = 3 nmol/mol).

[2.14] measurement trueness #top#

trueness of measurement


closeness of agreement between the average of an infinite number of replicate →measured quantity values and a →reference quantity value [1]

  1. Measurement trueness is not a →quantity and thus cannot be expressed numerically, but measures for closeness of agreement are given in ISO 5725 [9].
  2. Measurement trueness is inversely related to →systematic measurement error, but is not related to →random measurement error.
  3. →Measurement accuracy should not be used for 'measurement trueness' and vice versa.
  4. While the terminology used in connection with trueness is very similar to that used for accuracy, trueness applies to the average value of a large number of measurements.
  5. Measurement trueness is an important concept for comparisons, e.g. to determine if measurements at different sites are on the same scale.

[2.15] measurement precision #top#


closeness of agreement between →indications or →measured quantity values obtained by replicate →measurements on the same or similar objects under specified conditions [1].

  1. Measurement precision is usually expressed numerically by measures of imprecision, such as standard deviation, variance, or coefficient of variation under the specified conditions of measurement.
  2. The 'specified conditions' can be, for example, →repeatability conditions of measurement, intermediate precision conditions of measurement, or →reproducibility conditions of measurement (see ISO 5725-3:1994).
  3. Measurement precision is used to define →measurement repeatability, intermediate measurement precision, and →measurement reproducibility.
  4. Sometimes "measurement precision" is erroneously used to mean →accuracy.
  5. Measurement precision is a measure of the dispersion of values.
  6. Precision depends only on the distribution of random errors and does not relate to the "true" value or to the specified value.
  - EXAMPLE (GAW): The standard deviation of a set of values obtained in a finite number of analyses of the same sample (e.g. CH4 dry air mole fraction) amounts to 2 nmol/mol, which may indicate good precision of the particular instrument.
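
The standard-deviation measure of imprecision mentioned in Note 1 can be sketched in a few lines; the replicate values below are made up for illustration:

```python
import statistics

# Hypothetical replicate analyses of the same sample: CH4 dry air mole
# fractions in nmol/mol (illustrative values only).
replicates = [1849.0, 1851.5, 1850.5, 1847.5, 1851.0, 1850.5]

mean = statistics.fmean(replicates)
s = statistics.stdev(replicates)  # sample standard deviation (n - 1)

print(f"mean = {mean:.1f} nmol/mol, precision (1 s.d.) = {s:.1f} nmol/mol")
```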

[2.16] measurement error #top#

error of measurement


→measured quantity value minus a →reference quantity value [1]

  1. The concept of 'measurement error' can be used both
    a) when there is a single reference quantity value to refer to, which occurs if a →calibration is made by means of a →measurement standard with a →measured quantity value having a negligible →measurement uncertainty or if a →conventional quantity value is given, in which case the measurement error is known, and
    b) if a →measurand is supposed to be represented by a unique →true quantity value or a set of true quantity values of negligible range, in which case the measurement error is not known.
  2. Measurement error should not be confused with production error or mistake.
  - EXAMPLE (GAW): Analysis of the standard gas mixture results in a CH4 mole fraction of 1847 nmol/mol instead of the 1850 nmol/mol indicated on the target cylinder, i.e. a measurement error of -3 nmol/mol.

[2.17] systematic measurement error #top#

systematic error of measurement

systematic error

component of →measurement error that in replicate →measurements remains constant or varies in a predictable manner [1]

  1. A →reference quantity value for a systematic measurement error is a →true quantity value, or a →measured quantity value of a →measurement standard of negligible →measurement uncertainty, or a →conventional quantity value.
  2. Systematic measurement error, and its causes, can be known or unknown. A →correction can be applied for a known systematic measurement error. [1]
  3. Systematic measurement error equals measurement error minus →random measurement error [1].
  4. Systematic error may be constant or may depend on the value of the measurand.
  5. For a measuring instrument, see also →bias.
  - EXAMPLE (GAW): The -3 nmol/mol measurement error in the example under →[2.16] can comprise xx nmol/mol of systematic error. This systematic error might e.g. be related to a wrong value assigned to the standard used for the calibration during the measurements.

[2.18] measurement bias #top#


estimate of a →systematic measurement error [1]

  1. The bias of a measuring instrument is normally estimated by averaging the error of indication over an appropriate number of measurements.
  2. In other words: bias is the difference between the average value of a large series of measurements and the accepted ("true") value. Bias is equivalent to the total systematic error in a measurement.
  3. A correction to negate the systematic error can be made by adjusting for the bias.
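
Notes 1 and 3 can be sketched numerically: average the errors of indication against a reference value to estimate the bias, then subtract the bias as a correction. All values below are hypothetical:

```python
import statistics

# Hypothetical repeated analyses of a standard whose assigned (reference)
# value is 1850.0 nmol/mol (illustrative numbers only).
reference = 1850.0
indications = [1847.2, 1846.8, 1847.5, 1846.9, 1847.6]

# Note 1: bias estimated by averaging the error of indication.
errors = [x - reference for x in indications]
bias = statistics.fmean(errors)

# Note 3: a correction negates the estimated systematic error.
corrected = [x - bias for x in indications]

print(f"estimated bias = {bias:.2f} nmol/mol")
print(f"mean after correction = {statistics.fmean(corrected):.1f} nmol/mol")
```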

[2.19] random measurement error #top#

random error of measurement

random error

component of measurement error that in replicate measurements varies in an unpredictable manner [1]

  1. A →reference quantity value for a random measurement error is the average that would ensue from an infinite number of replicate measurements of the same →measurand.
  2. Random measurement errors of a set of replicate measurements form a distribution that can be summarized by its expectation, which is generally assumed to be zero, and its variance.
  3. Random measurement error equals measurement error minus →systematic measurement error.
  4. Because only a finite number of measurements can be made, it is only possible to determine an estimate of random error.

[2.20] repeatability condition of measurement #top#

repeatability condition

condition of →measurement, out of a set of conditions that includes the same →measurement procedure, same operators, same →measuring system, same operating conditions and same location, and replicate measurements on the same or similar objects over a short period of time [1]

  1. A condition of measurement is a repeatability condition only with respect to a specified set of repeatability conditions.
  2. In chemistry, the term "intra-serial precision condition of measurement" is sometimes used to designate this concept.

[2.21] measurement repeatability #top#


→measurement precision under a set of →repeatability conditions of measurement [1]

[2.24] reproducibility condition of measurement #top#

reproducibility condition

condition of →measurement, out of a set of conditions that includes different locations, operators, →measuring systems, and replicate measurements on the same or similar objects [1]

  1. The different measuring systems may use different →measurement procedures.
  2. A specification should give the conditions changed and unchanged, to the extent practical.
  - EXAMPLE (GAW): WMO/GAW round-robin experiments.

[2.25] measurement reproducibility #top#


→measurement precision under →reproducibility conditions of measurement [1]

  1. Relevant statistical terms are given in ISO 5725-1:1994 and ISO 5725-2:1994.
  2. See NOTES under →reproducibility condition of measurement.
  3. ISO 5725-1:1994 and ISO 5725-2:1994 describe the general principles and basic method for the determination of repeatability and reproducibility of a standard measurement method, respectively.

[2.26] measurement uncertainty #top#

uncertainty of measurement


non-negative parameter characterising the dispersion of the →quantity values being attributed to a →measurand, based on the information used [1]

  1. Measurement uncertainty includes components arising from systematic effects, such as components associated with →corrections and the assigned quantity values of →measurement standards, as well as the →definitional uncertainty. Sometimes estimated systematic effects are not corrected for but, instead, associated measurement uncertainty components are incorporated.
  2. The parameter may be, for example, a standard deviation called →standard measurement uncertainty (or a specified multiple of it), or the half-width of an interval, having a stated →coverage probability.
  3. Measurement uncertainty comprises, in general, many components. Some of these may be evaluated by →Type A evaluation of measurement uncertainty from the statistical distribution of the quantity values from series of →measurements and can be characterized by standard deviations. The other components, which may be evaluated by →Type B evaluation of measurement uncertainty, can also be characterized by standard deviations, evaluated from probability density functions based on experience or other information.
  4. In general, for a given set of information, it is understood that the measurement uncertainty is associated with a stated quantity value attributed to the measurand. A modification of this value results in a modification of the associated uncertainty.
  5. The concept of "uncertainty" is explained in detail in the GUM [4]. In practice, the term →"measurement error" often seems to be used when →"measurement uncertainty" is actually meant. An error is viewed as having two components, a random and a systematic component [4]. As further stated in this reference, "error" is an idealised concept and errors cannot be known exactly. "Error" and "uncertainty" are not synonyms, but represent completely different concepts.

[2.27] definitional uncertainty #top#

component of →measurement uncertainty resulting from the finite amount of detail in the definition of a →measurand [1]

  1. Definitional uncertainty is the practical minimum measurement uncertainty achievable in any →measurement of a given measurand.
  2. Any change in the descriptive detail leads to another definitional uncertainty.
  3. In the ISO/IEC Guide 98-3:2008 [4], D.3.4, and in IEC 60359, the concept 'definitional uncertainty' is termed "intrinsic uncertainty".
  - EXAMPLE (GAW): A pertinent example is the 'mole fraction of CO2'. This is a commonly specified measurand, and many reference standards specify, e.g., a certain mole fraction of CO2 in natural air. The definitional uncertainty relates to the lack of specification of the isotopic composition. Consequently, any measurement of the 'mole fraction of CO2' in an air sample that is calibrated against such a reference standard will include an uncertainty due to the potentially different sensitivity of the measuring instrument to different isotopes of C and O in CO2. This becomes more relevant the more the isotopic compositions of standard and sample differ, and the more sensitive the instrument is to specific isotopes.

[2.28] Type A evaluation of measurement uncertainty #top#

Type A evaluation

evaluation of a component of →measurement uncertainty by a statistical analysis of →measured quantity values obtained under defined measurement conditions [1]

  1. For various types of measurement conditions, see →repeatability condition of measurement, intermediate precision condition of measurement, and →reproducibility condition of measurement.
  2. For information about statistical analysis, see e.g. ISO/IEC Guide 98-3 [4].
  3. See also ISO/IEC Guide 98-3:2008 [4], 2.3.2, ISO 5725, ISO 13528, ISO/TS 21748, ISO/TS 21749.
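
A minimal Type A evaluation under repeatability conditions can be sketched as follows, taking the experimental standard deviation of the mean, s/√n, as the standard uncertainty of the averaged result (all values hypothetical):

```python
import math
import statistics

# Hypothetical replicate indications obtained under repeatability
# conditions (illustrative values only).
values = [381.02, 380.98, 381.05, 380.99, 381.01, 380.95]

n = len(values)
mean = statistics.fmean(values)
s = statistics.stdev(values)   # experimental standard deviation
u_type_a = s / math.sqrt(n)    # standard uncertainty of the mean

print(f"mean = {mean:.3f}, u(Type A) = {u_type_a:.3f} (same unit as input)")
```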

[2.29] Type B evaluation of measurement uncertainty #top#

Type B evaluation

evaluation of a component of →measurement uncertainty determined by means other than a →Type A evaluation of measurement uncertainty [1]

  1. See also ISO/IEC Guide 98-3:2008 [4], 2.3.3.
  - EXAMPLE (GAW): Drift would typically refer to instrument drift, either oscillating as a result of, e.g., diurnal temperature variations, or unidirectional, e.g., as a result of aging of a sensor.

[2.30] standard measurement uncertainty #top#

standard uncertainty of measurement

standard uncertainty

→measurement uncertainty expressed as a standard deviation [1]

[2.31] combined standard measurement uncertainty #top#

combined standard uncertainty

→standard measurement uncertainty that is obtained using the individual →standard measurement uncertainties associated with the →input quantities in a measurement model [1]

  1. In case of correlations of input quantities in a measurement model, covariances must also be taken into account when calculating the combined standard measurement uncertainty; see also ISO/IEC Guide 98-3:2008 [4], 2.3.4.
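
For uncorrelated input quantities, the law of propagation of uncertainty in the GUM reduces to a root sum of squares of the sensitivity-weighted standard uncertainties. A minimal sketch with a hypothetical uncertainty budget (covariance terms, needed for correlated inputs, are deliberately omitted here):

```python
import math

# u_c(y) = sqrt( sum_i (c_i * u(x_i))**2 )  for uncorrelated inputs,
# where c_i are the sensitivity coefficients dy/dx_i.
def combined_standard_uncertainty(contributions):
    """contributions: iterable of (sensitivity_coefficient, standard_uncertainty)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

# Hypothetical uncertainty budget (illustrative values only):
budget = [
    (1.0, 0.05),   # e.g. repeatability of the indication
    (1.0, 0.12),   # e.g. uncertainty of the standard's assigned value
    (0.5, 0.08),   # e.g. a temperature sensitivity term
]
u_c = combined_standard_uncertainty(budget)
print(f"u_c = {u_c:.3f}")
```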

[2.35] expanded measurement uncertainty #top#

expanded uncertainty

quantity defining an interval about the result of a measurement that may be expected to encompass a large fraction of the distribution of values that could reasonably be attributed to the →measurand [4], given by the product of a →combined standard measurement uncertainty and a factor larger than the number one [1]

  1. The factor depends upon the type of probability distribution of the →output quantity in a measurement model and on the selected →coverage probability. [1]
  2. The term "factor" in this definition refers to a →coverage factor. [1]
  3. Expanded measurement uncertainty is termed "overall uncertainty" in paragraph 5 of Recommendation INC-1 (1980) (see the GUM) and simply "uncertainty" in IEC documents. [1]
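
The product in the definition, U = k · u_c, and the usual way of reporting a result with it can be sketched as follows (the numbers are hypothetical):

```python
# Expanded uncertainty U = k * u_c, reported together with the result.
k = 2.0          # coverage factor, ~95 % coverage for a normal distribution
u_c = 0.14       # hypothetical combined standard uncertainty, µmol/mol
value = 381.00   # hypothetical measured quantity value, µmol/mol

U = k * u_c
print(f"result: ({value:.2f} ± {U:.2f}) µmol/mol (k = {k:g})")
```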

[2.36] coverage interval #top#

interval containing the set of →true quantity values of a →measurand with a stated probability, based on the information available [1]

  1. A coverage interval does not need to be centred on the chosen →measured quantity value (see ISO/IEC Guide 98-3:2008/Suppl.1 [4]).
  2. A coverage interval should not be termed "confidence interval" to avoid confusion with the statistical concept (see ISO/IEC Guide 98-3:2008, 6.2.2 [4]).
  3. A coverage interval can be derived from an →expanded measurement uncertainty (see ISO/IEC Guide 98-3:2008, 2.3.5 [4]).

[2.37] coverage probability #top#

probability that the set of →true quantity values of a →measurand is contained within a specified →coverage interval [1]

  1. This definition pertains to the Uncertainty Approach as presented in the GUM [4].
  2. The coverage probability is also termed "level of confidence" in the GUM [4].

[2.38] coverage factor #top#

number larger than one by which a →combined standard measurement uncertainty is multiplied to obtain an →expanded measurement uncertainty [1]

  1. A coverage factor is usually symbolized k (see also ISO/IEC Guide 98-3:2008, 2.3.6 [4]).
  2. Coverage factors are typically in the range 2 to 3.
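
The link between the coverage factor and the →coverage probability can be sketched for the common case of a normally distributed output quantity, where p = erf(k/√2); this assumption is what makes k = 2 correspond to roughly 95 % coverage:

```python
import math

def normal_coverage_probability(k: float) -> float:
    """Coverage probability of an interval of ±k standard uncertainties,
    assuming the output quantity is normally distributed: p = erf(k / sqrt(2))."""
    return math.erf(k / math.sqrt(2.0))

for k in (1.0, 2.0, 3.0):
    print(f"k = {k:g}: p = {100 * normal_coverage_probability(k):.2f} %")
```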

[2.39] calibration #top#

operation that, under specified conditions, in a first step, establishes a relation between the →quantity values with →measurement uncertainties provided by measurement standards and corresponding →indications with associated measurement uncertainties and, in a second step, uses this information to establish a relation for obtaining a →measurement result from an indication [1]

  1. A calibration may be expressed by a statement, calibration function, →calibration diagram, →calibration curve, or calibration table. In some cases, it may consist of an additive or multiplicative →correction of the indication with associated measurement uncertainty. [1]
  2. Calibration should not be confused with →adjustment of a measuring system, often mistakenly called "self-calibration", nor with verification of calibration.
  3. Often, the first step alone in the above definition is perceived as being calibration.

[2.40] calibration hierarchy #top#

sequence of →calibrations from a reference to the final →measuring system, where the outcome of each calibration depends on the outcome of the previous calibration [1]

  1. →Measurement uncertainty necessarily increases along the sequence of calibrations.
  2. The elements of a calibration hierarchy are one or more →measurement standards and measuring systems operated according to →measurement procedures.
  3. For this definition, the 'reference' can be a definition of a →measurement unit through its practical realization, or a measurement procedure, or a measurement standard.
  4. A comparison between two measurement standards may be viewed as a calibration if the comparison is used to check and, if necessary, correct the →quantity value and measurement uncertainty attributed to one of the measurement standards.

[2.41] metrological traceability #top#

property of a →measurement result whereby the result can be related to a reference through a documented unbroken chain of →calibrations, each contributing to the →measurement uncertainty [1]

  1. For this definition, a 'reference' can be a definition of a →measurement unit through its practical realization, or a →measurement procedure including the measurement unit for a →non-ordinal quantity, or a →measurement standard.
  2. Metrological traceability requires an established →calibration hierarchy.
  3. Specification of the reference must include the time at which this reference was used in establishing the calibration hierarchy, along with any other relevant metrological information about the reference, such as when the first calibration in the calibration hierarchy was performed.
  4. For →measurements with more than one →input quantity in the measurement model, each of the input →quantity values should itself be metrologically traceable and the calibration hierarchy involved may form a branched structure or a network. The effort involved in establishing metrological traceability for each input quantity value should be commensurate with its relative contribution to the measurement result.
  5. Metrological traceability of a measurement result does not ensure that the measurement uncertainty is adequate for a given purpose or that there is an absence of mistakes.
  6. A comparison between two measurement standards may be viewed as a calibration if the comparison is used to check and, if necessary, correct the quantity value and measurement uncertainty attributed to one of the measurement standards.
  7. The ILAC considers the elements for confirming metrological traceability to be an unbroken metrological traceability chain to an →international measurement standard or a →national measurement standard, a documented measurement uncertainty, a documented measurement procedure, accredited technical competence, metrological traceability to the →SI, and calibration intervals (see ILAC P-10:2002).
  8. The abbreviated term "traceability" is sometimes used to mean 'metrological traceability' as well as other concepts, such as 'sample traceability' or 'document traceability' or 'instrument traceability' or 'material traceability', where the history ("trace") of an item is meant. Therefore, the full term of "metrological traceability" is preferred if there is any risk of confusion.
  1. To minimize the accumulation of measurement uncertainty, institutes should maintain as direct as possible a path between their laboratory standards and the →CCL.

[2.42] metrological traceability chain #top#

traceability chain

sequence of →measurement standards and →calibrations that is used to relate a →measurement result to a reference [1]

  1. A metrological traceability chain is defined through a →calibration hierarchy.
  2. A metrological traceability chain is used to establish →metrological traceability of a measurement result.
  3. A comparison between two measurement standards may be viewed as a calibration if the comparison is used to check and, if necessary, correct the →quantity value and →measurement uncertainty attributed to one of the measurement standards.

[2.46] metrological comparability of measurement results #top#

metrological comparability

comparability of →measurement results, for →quantities of a given →kind, that are metrologically traceable to the same reference [1]

  1. See Note 1 to 2.41 →metrological traceability
  2. Metrological comparability of measurement results does not necessitate that the →measured quantity values and associated →measurement uncertainties compared be of the same order of magnitude.
  1. Metrological comparability requires that quantities are presented in the same units.

[2.47] metrological compatibility of measurement results #top#

metrological compatibility

property of a set of →measurement results for a specified →measurand, such that the absolute value of the difference of any pair of →measured quantity values from two different measurement results is smaller than some chosen multiple of the →standard measurement uncertainty of that difference [1]

  1. Metrological compatibility of measurement results replaces the traditional concept of 'staying within the error', as it represents the criterion for deciding whether two measurement results refer to the same measurand or not. If in a set of →measurements of a measurand, thought to be constant, a measurement result is not compatible with the others, either the measurement was not correct (e.g. its →measurement uncertainty was assessed as being too small) or the measured →quantity changed between measurements.
  2. Correlation between the measurements influences metrological compatibility of measurement results. If the measurements are completely uncorrelated, the standard measurement uncertainty of their difference is equal to the root mean square sum of their standard measurement uncertainties, while it is lower for positive covariance or higher for negative covariance.
  1. For the WMO/GAW network, this refers to the dispersion of measurements of the same standard by different laboratories (within network comparability).
  2. The term may also be used with reference to measurements by different laboratories in different places.
  3. Moreover, the term may be used to describe the difference between a measurement of a species in a discrete sample and an averaged continuous measurement for a period that includes the time in which the discrete sample was collected.
  4. In the case of significantly different variances of two sample sets, the mean difference may not be meaningful. The Wilcoxon-Mann-Whitney test can be used to test for statistical significance.
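The compatibility criterion above lends itself to a direct numerical check. The sketch below (an illustrative helper, with the multiple k and the covariance as assumed inputs) tests whether two measured quantity values differ by less than k times the standard uncertainty of their difference; per Note 2, positive covariance lowers that uncertainty and negative covariance raises it.

```python
import math

def compatible(x1, u1, x2, u2, k=2.0, cov=0.0):
    """True if |x1 - x2| <= k * u_diff, where u_diff is the standard
    measurement uncertainty of the difference of the two results
    (root-sum-square of u1 and u2, corrected for covariance)."""
    u_diff = math.sqrt(u1**2 + u2**2 - 2.0 * cov)
    return abs(x1 - x2) <= k * u_diff

# Two uncorrelated CO2 results (umol/mol): 385.02 +/- 0.05 and 384.95 +/- 0.05.
# u_diff = sqrt(0.05**2 + 0.05**2) ~ 0.071, |difference| = 0.07 <= 2 * 0.071,
# so the two results are metrologically compatible at k = 2.
```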

[2.50] input quantity in a measurement model #top#

input quantity

→quantity that must be measured, or a quantity, the →value of which can be otherwise obtained, in order to calculate a →measured quantity value of a →measurand [1]

  1. When the length of a steel rod at a specified temperature is the measurand, the actual temperature, the length at that actual temperature, and the linear thermal expansion coefficient of the rod are input quantities in a measurement model.
  1. An input quantity in a measurement model is often an output quantity of a →measuring system.
  2. →Indications, →corrections and influence →quantities can be input quantities in a measurement model.
  1. Instruments measuring the ozone mole fraction based on the UV absorption technique usually correct for temperature and pressure changes in the measurement cell. Thus, absorbed light intensity, temperature and pressure in the cell are used as input quantities to obtain ozone mole fractions.

[2.51] output quantity in a measurement model #top#

output quantity

→quantity, the →measured value of which is calculated using the →values of →input quantities in a measurement model [1]

[2.53] correction #top#

compensation for an estimated systematic effect [1]

  1. See ISO/IEC Guide 98-3:2008 [4], 3.2.3, for an explanation of 'systematic effect'.
  2. The compensation can take different forms, such as an addend or a factor, or can be deduced from a table.


SECTION 3 - Devices for Measurement

[3.1] measuring instrument #top#

device used for making →measurements, alone or in conjunction with one or more supplementary devices [1]

  1. A measuring instrument that can be used alone is a →measuring system.

[3.2] measuring system #top#

set of one or more →measuring instruments and often other devices, including any reagent and supply, assembled and adapted to give information used to generate →measured quantity values within specified intervals for →quantities of specified kinds [1]

  1. A measuring system may consist of only one measuring instrument.

[3.11] adjustment of a measuring system #top#


set of operations carried out on a →measuring system so that it provides prescribed →indications corresponding to given →values of a →quantity to be measured [1]

  1. Types of adjustment of a measuring system include →zero adjustment of a measuring system, offset adjustment, and span adjustment (sometimes called gain adjustment).
  2. Adjustment of a measuring system should not be confused with →calibration, which is a prerequisite for adjustment.
  3. After an adjustment of a measuring system, the measuring system must usually be recalibrated.

[3.12] zero adjustment of a measuring system #top#

zero adjustment

→adjustment of a measuring system so that it provides a null →indication corresponding to a zero →value of a →quantity to be measured [1]


SECTION 4 - Properties of Measuring Devices

[4.1] indication #top#

quantity value provided by a →measuring instrument or a →measuring system [1]

  1. An indication may be presented in visual or acoustic form or may be transferred to another device. An indication is often given by the position of a pointer on the display for analog outputs, a displayed or printed number for digital outputs, a code pattern for code outputs, or an assigned quantity value for material measures.
  2. An indication and a corresponding value of the →quantity being measured are not necessarily values of quantities of the same kind.

[4.6] nominal quantity value #top#

nominal value

rounded or approximate →value of a characterizing →quantity of a →measuring instrument or →measuring system that provides guidance for its appropriate use [1]

  1. 1000 ml as the nominal quantity value marked on a single-mark volumetric flask.
  1. "Nominal quantity value" and "nominal value" are not to be confused with →"nominal property value" (see 1.30, Note 2).
  1. The minimum detection limit or the sampling time are examples of nominal values of a measuring instrument.

[4.12] sensitivity of a measuring system #top#


quotient of the change in an →indication of a →measuring system and the corresponding change in a →value of a →quantity being measured [1]

  1. Sensitivity of a measuring system can depend on the value of the quantity being measured.
  2. The change considered in a value of a quantity being measured must be large compared with the →resolution.

[4.13] selectivity of a measuring system #top#


property of a →measuring system, used with a specified →measurement procedure, whereby it provides measured →quantity values for one or more →measurands such that the values of each measurand are independent of other measurands or other →quantities in the phenomenon, body, or substance being investigated [1]

  1. Capability of a measuring system including a mass spectrometer to measure the ion current ratio generated by two specified compounds without disturbance by other specified sources of electric current.
  2. Capability of a measuring system to measure the power of a signal component at a given frequency without being disturbed by signal components or other signals at other frequencies.
  3. Capability of a measuring system for ionizing radiation to respond to a given radiation to be measured in the presence of concomitant radiation.
  4. Capability of a mass spectrometer to measure the amount-of-substance abundance of the 28Si isotope and of the 30Si isotope in silicon from a geological deposit without influence between the two, or from the 29Si isotope.
  1. In physics, there is only one measurand; the other quantities are of the same kind as the measurand, and they are input quantities to the measuring system.
  2. In chemistry, the measured quantities often involve different components in the system undergoing measurement and these quantities are not necessarily of the same kind.
  3. In chemistry, selectivity of a measuring system is usually obtained for quantities with selected components in concentrations within stated intervals.
  4. Selectivity as used in physics (see Note 1) is a concept close to specificity as it is sometimes used in chemistry.
  1. A measuring system is highly selective for a specific trace gas (e.g. CO) mole fraction if its quantity value (e.g. 2 nmol/mol) is independent of changes in the sample matrix (e.g. humidity changes).
  2. Capability of a gas chromatographic system to fully separate the peak of substance A from the peak of substance B in a chromatogram.

[4.14] resolution #top#

smallest change in a →quantity being measured that causes a perceptible change in the corresponding →indication [1]

  1. Resolution can depend on, for example, noise (internal or external) or friction. It may also depend on the →value of a quantity being measured.

[4.31] calibration curve #top#

expression of the relation between →indication and corresponding →measured quantity value [1]

  1. A calibration curve expresses a one-to-one relation that does not supply a →measurement result as it bears no information about the →measurement uncertainty.


SECTION 5 - Measurement Standards

[5.1] measurement standard #top#


realization of the definition of a given quantity, with stated quantity value and associated measurement uncertainty, used as a reference [1]. See →standard for a related definition.

  1. 1 kg mass measurement standard with an associated →standard measurement uncertainty of 3 µg.
  2. 100 Ohm measurement standard resistor with an associated standard measurement uncertainty of 1 µOhm.
  3. Hydrogen reference electrode with an assigned quantity value of 7.072 and an associated standard measurement uncertainty of 0.006.
  1. A "realization of the definition of a given quantity" can be provided by a →measuring system, a →material measure, or a reference material.
  2. A measurement standard is frequently used as a reference in establishing →measured quantity values and associated measurement uncertainties for other quantities of the same →kind, thereby establishing →metrological traceability through →calibration of other measurement standards, →measuring instruments, or measuring systems.
  3. The term "realization" is used here in the most general meaning. It denotes three procedures of "realization". The first one consists in the physical realization of the measurement unit from its definition and is realization sensu stricto. The second, termed "reproduction", consists not in realizing the →measurement unit from its definition but in setting up a highly reproducible measurement standard based on a physical phenomenon, as it happens, e.g. in case of use of frequency-stabilized lasers to establish a measurement standard for the metre, of the Josephson effect for the volt or of the quantum Hall effect for the ohm. The third procedure consists in adopting a material measure as a measurement standard. It occurs in the case of the measurement standard of 1 kg.
  4. A standard measurement uncertainty associated with a measurement standard is always a component of the →combined standard measurement uncertainty (see ISO/IEC Guide 98-3:2008 [4], 2.3.4) in a measurement result.
  5. Quantity value and measurement uncertainty must be determined at the time when the measurement standard is used.
  6. Several quantities of the same kind or of different kinds may be realized in one device which is commonly also called a measurement standard.
  7. The word "embodiment" is sometimes used in the English language instead of "realization".
  8. In science and technology, the English word "standard" is used with at least two different meanings: as a specification, technical recommendation, or similar normative document (in French « norme ») and as a measurement standard (in French « étalon »). This Vocabulary is concerned solely with the second meaning.

[5.4] primary measurement standard #top#

primary standard

→measurement standard established using a →primary reference measurement procedure, or created as an artifact, chosen by convention [1]

  1. Primary measurement standard of amount-of-substance concentration prepared by dissolving a known amount of substance of a chemical component in a known volume of solution.
  2. Primary measurement standard for pressure based on separate →measurements of force and area.
  3. Primary measurement standard for isotope amount-of-substance ratio measurements, prepared by mixing known amounts of substance of specified isotopes.
  1. In particular with respect to trace gases, standard with assigned mole fraction based on absolute →calibration, i.e. gravimetric or equivalent method.
  2. For example, within WMO/GAW, the primary standards for the trace gases CO2, CH4, CO, and N2O are maintained at NOAA. The complete list of →CCLs can be found at the GAW web site (Quality Assurance).

[5.5] secondary measurement standard #top#

secondary standard

→measurement standard established through →calibration with respect to a →primary measurement standard for a →quantity of the same →kind [1].

  1. Calibration may be obtained directly between a primary measurement standard and a secondary measurement standard, or involve an intermediate →measuring system calibrated by the primary measurement standard and assigning a →measurement result to the secondary measurement standard.
  2. A measurement standard having its →quantity value assigned by a ratio →primary reference measurement procedure is a secondary measurement standard.
  1. For trace gas measurements within WMO/GAW, this refers to a standard (natural air or synthetic gas mixture) with mole fractions for target species that are obtained from comparisons made by the →Central Calibration Laboratory with primary standards kept at its laboratory.

[5.6] reference measurement standard #top#

reference standard

→measurement standard designated for the →calibration of other measurement standards for →quantities of a given →kind in a given organization or at a given location [1]

  1. The term 'reference standard' was used in the WMO/GAW Reports 142 [6] and 156 to indicate the WMO/GAW primary standards or the organisation that maintains them. Since then, the term '→primary standard' has been preferred, in keeping with the ISO definition.

[5.7] working measurement standard #top#

working standard

→measurement standard that is used routinely to calibrate or verify →measuring instruments or →measuring systems [1]

  1. A working measurement standard is usually calibrated with respect to a reference measurement standard [1].
  2. In relation to verification, the terms "check standard" or "control standard" are also sometimes used [1].
  1. For stable gases, any gas (natural air or synthetic gas mixture) with assigned mole fractions of one or more trace species obtained from comparisons with the laboratory standard(s) of an individual laboratory or station, or from comparisons with transfer standards provided by another laboratory, such as the →WCC.
  2. For measurement of stable trace gases, usually gas cylinders denoted as working standards are employed as calibration cylinders for routine measurements.
  3. For other parameters measured within GAW, the traceability chain might be different from the one for trace gases.

[5.8] travelling measurement standard #top#

travelling standard

→measurement standard, sometimes of special construction, intended for transport between different locations [1]

  1. For measurements of stable trace gases within WMO/GAW, this refers in particular to compressed gas cylinders (natural air or synthetic gas mixture) for use at different locations with an assigned mole fraction of one or more trace species resulting from calibration(s) by the →CCL or from comparisons with laboratory standards by an approved laboratory, such as the →WCC.
  2. Other components within the WMO/GAW programme might require the use of a special instrument designated as travelling standard, e.g., ozone calibrator for surface ozone measurements.

[5.9] transfer measurement device #top#

transfer device

device used as an intermediary to compare →measurement standards [1]

  1. Sometimes, measurement standards are used as transfer devices.
  1. For trace gas measurements within WMO/GAW, this refers in particular to compressed gas cylinders (natural air or synthetic gas mixture) for use at different locations with an assigned mole fraction of one or more trace species resulting from calibration(s) by the →CCL or from comparisons with laboratory standards by an approved laboratory, such as the →WCC.
  2. The term transfer standard is often used in the same sense as →travelling measurement standard.
  3. A typical example for a transfer device would be a portable analyzer used to compare laboratory standards with travelling standards during an audit. Incidentally, the station analyzer itself often serves this purpose.

[5.13] reference material #top#


material, sufficiently homogeneous and stable with reference to specified properties, which has been established to be fit for its intended use in →measurement or in examination of →nominal properties [1]

  1. Examination of a nominal property provides a nominal property value and associated uncertainty. This uncertainty is not a →measurement uncertainty.
  2. Reference materials with or without assigned →quantity values can be used for →measurement precision control whereas only reference materials with assigned quantity values can be used for →calibration or →measurement trueness control.
  3. 'Reference material' comprises materials embodying →quantities as well as →nominal properties.
    - EXAMPLE 1: Examples of reference materials embodying quantities:
    a) water of stated purity, the dynamic viscosity of which is used to calibrate viscometers;
    - EXAMPLE 2: Examples of reference materials embodying nominal properties:
    a) colour chart indicating one or more specified colours;
  4. A reference material is sometimes incorporated into a specially fabricated device.
    - EXAMPLE 1: Substance of known triple-point in a triple-point cell.
    - EXAMPLE 2: Glass of known optical density in a transmission filter holder.
  5. In a given →measurement, a given reference material can only be used for either calibration or quality assurance.
  6. The specifications of a reference material should include its material traceability, indicating its origin and processing (Accred. Qual. Assur.: 2006) [45].
  7. ISO/REMCO has an analogous definition [45] but uses the term "measurement process" to mean 'examination' (ISO 15189:2007, 3.4), which covers both measurement of a quantity and examination of a nominal property.

[5.14] certified reference material #top#

→reference material, accompanied by documentation issued by an authoritative body and providing one or more specified property values with associated uncertainties and traceabilities, using valid procedures [1]

  1. 'Documentation' is given in the form of a 'certificate' (see ISO Guide 31:2000).
  2. Procedures for the production and certification of certified reference materials are given, e.g. in ISO Guide 34 and ISO Guide 35.
  3. In this definition, "uncertainty" covers both 'measurement uncertainty' and 'uncertainty associated with the value of a →nominal property', such as for identity and sequence. "Traceability" covers both '→metrological traceability of a quantity value' and 'traceability of a nominal property value'.
  4. Specified quantity values of certified reference materials require metrological traceability with associated measurement uncertainty (Accred. Qual. Assur.: 2006) [45].
  5. ISO/REMCO has an analogous definition (Accred. Qual. Assur.: 2006) [45] but uses the modifiers 'metrological' and 'metrologically' to refer to both quantity and nominal property.

[5.18] reference quantity value #top#

reference value

→quantity value used as a basis for comparison with values of →quantities of the same →kind [1]

  1. A reference quantity value can be a →true quantity value of a →measurand, in which case it is unknown, or a →conventional quantity value, in which case it is known.
  2. A reference quantity value with associated →measurement uncertainty is usually provided with reference to
    1. a material, e.g. a →certified reference material,
    2. a device, e.g. a stabilized laser,
    3. a →reference measurement procedure,
    4. a comparison of →measurement standards.



assigned value (of a quantity) #top#

synonym for conventional true value [2]


  1. The term 'conventional true value' is no longer defined in the VIM [1], which distinguishes between →conventional value and →true value.

audit #top#

  1. Performance audit: Voluntary check for conformity of a measurement where the audit criteria are the →data quality objectives (DQOs) for the specific parameter. In the absence of formal DQOs, an audit will at least involve ensuring the traceability of measurements to the Reference Standard [8].
  2. System audit: More generally defined as a check of the overall conformity of a station with the principles of the GAW system [8].

Central Calibration Laboratory (CCL) #top#

within the WMO/GAW network, laboratory responsible for maintaining the →standard scale for the species under consideration

concentration #top#

Mass of the particular component (gas) per unit volume of air (µg/m3). See also →recommendations below.

data quality objectives (DQOs) #top#

qualitative and quantitative statements that clarify the objectives of observations, define the appropriate type of data, and specify tolerable levels of →uncertainty. DQOs will be used as the basis for establishing the quality and quantity of data needed to support decisions (adapted from [7]).

  1. Decisions in this context include scientific decisions (e.g. significance testing of trends) as well as decisions of political or societal dimension.

laboratory standard #top#

standard of highest rank at an individual laboratory or station traceable to the WMO/GAW →standard scale

measurement guideline (MG) #top#

written instruction that provides basic information on various issues related to the measurement of a specific quantity. It usually covers major aspects ranging from instrumental set-up to obtaining final data and metadata of known quality

  1. MGs permit more flexibility for the way the measurements are conducted than →SOPs. Therefore MGs are used in the case of complex systems that can be set up differently and operated differently in practice.

(mass) mixing ratio #top#

Mass of the target gas (species) per mass of air (possible units are ppmm (also ppmw) = parts per million by mass (by weight), etc.). A specification whether it refers to dry or moist air is required. See also →recommendations below.

(volume) mixing ratio #top#

Number of molecules of the target gas (species) per number of air molecules in a given volume (possible units are ppmv = parts per million of air molecules by volume, ppbv = parts per billion of air molecules by volume, pptv = parts per trillion of air molecules by volume). A specification whether it refers to dry or moist air is required. See also →recommendations below.

  1. ppmv is equal to µmol/mol only if both the target gas and the matrix behave as ideal gases. This is usually unproblematic for trace gases that are solely in the gaseous phase at atmospheric pressure, but strictly speaking the ideality of a gas must be demonstrated, which is a matter of its virial coefficients. Equality does not hold for trace gases with a fraction of molecules in the condensed phase.

mole fraction #top#

Number of moles of the target gas (species) per mole of the sample mixture (possible units are nmol/mol, µmol/mol, etc.). A specification whether it refers to dry or moist air is required. See also →recommendations below.

quality assurance #top#

all planned and systematic actions necessary to provide adequate confidence that a product, process or service will satisfy given requirements for quality [5]

quality control #top#

operational techniques and activities that are used to fulfil given requirements for quality [5]

reference scale #top#

synonym for →conventional reference scale

(measurement) standard #top#

material measure, measuring instrument, reference material or measuring system intended to define, realize, conserve, or reproduce a unit or one or more values of a quantity to serve as a reference [2]

  1. See →measurement standard for current definition.
  1. In the case of trace gas measurements, generally any gas (natural air or synthetic gas mixture) with assigned mole fractions traceable to an accepted standard scale.
  2. For example, within WMO/GAW, the standard scales for CO2, CH4 and N2O are maintained by NOAA. For other GAW parameters see the respective GAW publications.

standard operating procedure (SOP) #top#

a written document that details the method for a program, operation, analysis, or action with thoroughly prescribed techniques and steps, and that is officially approved as the method for performing certain routine or repetitive tasks [7]

  1. In WMO/GAW, the term is understood to refer to a document that describes the measurement and quality assurance processes involved in obtaining the value of a quantity in as much detail as necessary to be able to achieve stated data quality objectives. For a similar term, see '→measurement procedure'.

standard scale #top#

synonym for →conventional reference scale

surveillance cylinder #top#

synonym for →target cylinder (target gas)

target cylinder (target gas) #top#

cylinder containing natural air or a synthetic gas mixture with assigned trace gas mole fractions that is treated as an (unknown) sample in a sequence of analyses

  1. The target cylinder, or target gas, is used for quality control measures. In the hierarchy of standards the target gas is usually on the same level as a →working standard.
  2. A WMO/GAW →WCC, in spite of its name, does not maintain its own calibration scale, but is linked to the respective →CCL. For a detailed description of the tasks of →WCCs see the WMO/GAW Strategic Plans.

tertiary standard #top#

standard calibrated at the →CCL by comparison with →secondary standards

  1. For trace gases, it is the →CCL tertiary standards that are used as →laboratory standards by the →World Calibration Centres (WCC), GAW stations and participating laboratories.

World Calibration Centre (WCC) #top#

part of the WMO/GAW network, responsible for quality assurance measures for one or more components, by way of →audits and intercomparisons

  1. For each component under consideration, the WCC refers to the calibration scale maintained by the →CCL designated by WMO/GAW.


Explanations & Recommendations #top#

In the following, some of the terms defined above are put into context for the practitioner who struggles with switching from the classical notion of 'systematic and random error' to the modern understanding of 'values, error, and uncertainty'. The reader is encouraged to study →Figure D.1 and →Figure D.2 in the GUM [4]

1) Accuracy #top#

Use the term →accuracy only in relative terms, e.g., to indicate that method X is more accurate (produces less →bias) than method Y. To quantify the deviation of an instrument or analysis from an expected true value, use the terms 'deviation' or →bias, e.g., instrument X is biased (or: deviates) by -1.5 % in comparison to the reference instrument Y.
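
The recommendation above can be sketched as a simple relative-deviation calculation. This is a minimal illustration; the function name and the numerical values are hypothetical, not part of any GAW-prescribed procedure:

```python
def relative_bias(measured_mean, reference_value):
    """Relative deviation (bias) of a measured mean from a reference
    value, expressed in percent. Names and values are illustrative."""
    return 100.0 * (measured_mean - reference_value) / reference_value

# Hypothetical example: instrument X reads a mean of 98.5 units where
# reference instrument Y gives 100.0 units, i.e. X is biased by -1.5 %.
bias = relative_bias(98.5, 100.0)
```

A negative value indicates that the instrument reads low relative to the reference, matching the usage "biased by -1.5 %" above.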

2) Concentration, Mixing Ratio and Mole Fraction #top#

There is a frequent misunderstanding of the mixing ratio 'units' ppm and ppb. One often reads, for example: The concentration of CO2 in the air is 385 ppm. This is wrong. 385 ppm refers to the dry air mole fraction of CO2 and is not a concentration. Mole fractions (habitually referred to as 'mixing ratios') give the number of molecules of a certain chemical in a fixed number of air molecules. Concentrations, on the other hand, refer to the amount of substance per unit volume, for example, the number of moles of a compound in a fixed volume of air, and are expressed, for example, in units of nmol m⁻³. Although 'mixing ratio' is the habitual term, the use of the term mole fraction is recommended, as it does not require an implicit assumption of ideality of gases and as it is also applicable to condensed-phase species [10]. See also →numerical quantity value.

  1. ppm (parts per million) = micromole / mole = 10⁻⁶, i.e. 1 in 1,000,000
  2. ppb (parts per billion) = nanomole / mole = 10⁻⁹, i.e. 1 in 1,000,000,000
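
The distinction can be illustrated with a short calculation: a mole fraction is invariant, while the corresponding concentration changes with pressure and temperature. The sketch below assumes ideal-gas behaviour; the function name and the chosen pressure/temperature values are illustrative only:

```python
R = 8.314462618  # molar gas constant, J mol^-1 K^-1

def mole_fraction_to_concentration(x_ppm, pressure_pa=101325.0, temp_k=273.15):
    """Convert a dry-air mole fraction in ppm to a concentration in
    micromol m^-3, assuming ideal-gas behaviour (illustrative sketch)."""
    air_molar_density = pressure_pa / (R * temp_k)  # mol of air per m^3
    return x_ppm * air_molar_density  # ppm x (mol m^-3) = micromol m^-3

# The same 385 ppm CO2 mole fraction corresponds to different
# concentrations at sea level and (hypothetically) at altitude:
c_sea_level = mole_fraction_to_concentration(385.0)
c_altitude = mole_fraction_to_concentration(385.0, pressure_pa=70000.0, temp_k=260.0)
```

Reporting the mole fraction avoids this pressure/temperature dependence, which is one reason it is the preferred quantity in WMO/GAW.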

3) Degrees of freedom #top#

The number of degrees of freedom, v, for the mean of a series of independent repeated analyses is simply the number of analyses minus 1. The determination of the number of degrees of freedom can, however, be more complex, for example when several uncertainty components with different degrees of freedom are combined. In such cases, the Welch-Satterthwaite formula [4] provides an estimate of the effective degrees of freedom, v_eff.
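
A minimal sketch of the Welch-Satterthwaite estimate, assuming uncorrelated uncertainty components (the numerical values below are hypothetical):

```python
def welch_satterthwaite(uncertainties, dofs):
    """Effective degrees of freedom v_eff for a combined standard
    uncertainty, following the Welch-Satterthwaite formula of the GUM [4].
    Assumes uncorrelated uncertainty components."""
    u_c_sq = sum(u**2 for u in uncertainties)  # combined variance u_c^2
    return u_c_sq**2 / sum(u**4 / v for u, v in zip(uncertainties, dofs))

# Hypothetical example: two components with 9 and 4 degrees of freedom.
v_eff = welch_satterthwaite([0.3, 0.4], [9, 4])
```

Note that v_eff is generally not an integer; the GUM recommends truncating it to the next lower integer when selecting a coverage factor from the t-distribution.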

4) Precision, Repeatability and Reproducibility #top#

Use the term →precision only in relative terms, e.g., to indicate that method X is more precise (produces less spread among the results) than method Y. For repeated observations of the same analyte (sample) with the same instrumentation under unchanged conditions, use the term →repeatability to quantify the spread, e.g., the gas chromatograph X allows determination of methane in a given flask with a repeatability of 0.1 % (1 standard deviation). To compare observations of the same analyte (sample) using different instrumentation/methodology, or observations made at significantly different times, e.g. on different days, use the term →reproducibility to quantify the spread, e.g., the dry mole fraction of methane in sample X was determined with a reproducibility of 0.5 % (1 standard deviation) using three independent instruments, namely two different GC-FIDs and a GC-MS.
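
The distinction can be sketched with the sample standard deviation; all measurement values below are invented for illustration:

```python
import statistics

# Hypothetical repeated analyses of methane in the same flask (ppb):
same_instrument = [1793.1, 1792.8, 1793.4, 1793.0, 1792.9]  # one GC, unchanged conditions
across_instruments = [1793.0, 1791.5, 1794.2]               # three independent instruments

# Repeatability: spread under unchanged conditions (1 standard deviation).
repeatability = statistics.stdev(same_instrument)

# Reproducibility: spread across instruments/methods/days (1 standard deviation).
reproducibility = statistics.stdev(across_instruments)
```

Reproducibility is typically larger than repeatability, since it includes additional sources of variation such as instrument-to-instrument differences and drift between measurement days.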

5) Trueness #top#

Trueness refers to the closeness of agreement between the arithmetic mean of a large number of measurements and the true reference value. Trueness is different from precision, as the latter refers to the closeness of agreement among independent repeated measurements.

6) Uncertainty of a measurement #top#

To express the →uncertainty of a measurement (i.e., the degree to which a measured result is unknown), use the terms →standard uncertainty (to express the uncertainty in terms of 1 standard deviation), →combined standard uncertainty (the positive square root of the sum of the variances, i.e. the squared standard uncertainties, of all terms contributing to the uncertainty), and →expanded uncertainty (similar to, but not strictly identical to, a confidence interval; obtained by multiplying the combined standard uncertainty by a coverage factor). For example, it is recommended to express a measurement result (e.g. of methane in an air sample) in the following way: x = (1793 ± 8) ppb (dry air mole fraction, k=2, v=3), where k is the selected coverage factor (k=2 is roughly equivalent to expressing a 95 % confidence interval) and v is the number of degrees of freedom.
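
The three uncertainty terms can be sketched as follows. The component values (repeatability, scale transfer, drift) are hypothetical and assume uncorrelated contributions:

```python
def combined_standard_uncertainty(components):
    """Positive square root of the sum of the squared standard
    uncertainties, assuming uncorrelated components."""
    return sum(u**2 for u in components) ** 0.5

def expanded_uncertainty(components, k=2):
    """Expanded uncertainty U = k * u_c for a chosen coverage factor k."""
    return k * combined_standard_uncertainty(components)

# Hypothetical standard uncertainty components, in ppb:
components = [3.0, 2.0, 1.5]  # e.g. repeatability, scale transfer, drift
u_c = combined_standard_uncertainty(components)
U = expanded_uncertainty(components, k=2)
# The result would then be reported as, e.g., x = (1793 +/- 8) ppb (k=2).
```

With these invented components, u_c is about 3.9 ppb and U about 7.8 ppb, consistent in form with the reporting example given above.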


References #top#

[1] Working Group 2 of the Joint Committee for Guides in Metrology, International vocabulary of metrology — Basic and general concepts and associated terms (VIM), 3rd edition, Bureau International des Poids et Mesures (BIPM) (Sèvres, France) and International Organization for Standardization (ISO) (Geneva, Switzerland), 104 p. (2008). The HTML version of this document is available from the BIPM/JCGM website.

[2] ISO Publications, International vocabulary of basic and general terms in metrology, 2nd edition, International Organization for Standardization (Geneva, Switzerland), (1993). The abbreviation of this title is VIM.

[3] ISO Publications, ISO 3534-1, Statistics — Vocabulary and symbols — Part 1: Probability and general statistical terms, International Organization for Standardization (Geneva, Switzerland) (1993)

[4] Working Group 1 of the Joint Committee for Guides in Metrology, Evaluation of measurement data — Guide to the expression of uncertainty in measurement (GUM), GUM 1995 with minor corrections, Bureau International des Poids et Mesures (BIPM) (Sèvres, France) and International Organization for Standardization (Geneva, Switzerland), 132 p. (2008). This document is considered equivalent to the following document, available from ISO: ISO Publications, ISO/IEC Guide 98-3:2008, Uncertainty of measurement — Part 3: Guide to the expression of uncertainty in measurement (GUM:1995), International Organization for Standardization (Geneva, Switzerland), (2008). The HTML version of JCGM 100, on which ISO/IEC Guide 98-3:2008 is based, is available from the BIPM/JCGM website.

[5] ISO Publications, ISO 8402, Quality Management and quality assurance - Vocabulary, International Organization for Standardization (Geneva, Switzerland) (1994).

[6] WMO (2001), Strategy for the Implementation of the Global Atmosphere Watch Programme (2001 - 2007), GAW Report No. 142, World Meteorological Organization, Geneva, Switzerland

[7] U. S. Environmental Protection Agency, EPA Quality System, Glossary (accessed 2006-11-30)

[8] WMO (2008), WMO Global Atmosphere Watch (GAW) Strategic Plan: 2008-2015, GAW Report No. 172, World Meteorological Organization, Geneva, Switzerland

[9] ISO Publications, ISO 5725, Accuracy (trueness and precision) of measurement methods and results, International Organization for Standardization (Geneva, Switzerland) (1994)

[10] Schwartz S. E., Warneck P., 1995. Units for use in atmospheric chemistry (IUPAC Recommendations 1995). Pure and Applied Chemistry 67 (8/9), 1377-1406.


Acknowledgements #top#

The WMO/GAW glossary was compiled and is maintained by QA/SAC Switzerland which is financially supported by →MeteoSwiss and →Empa through MeteoSwiss' →international cooperation program and Empa's →operation of two GAW central facilities.


Version history #top#

last update - 2016-05-26 (Martin Steinbacher): layout improvement, fixing broken links, minor rewording.