This subclause treats the case where all input quantities are **independent** (C.3.7). The
case where two or more input quantities are related, that is, are interdependent or **correlated**
(C.2.8), is discussed in 5.2.

5.1.1 The standard uncertainty of *y*, where *y* is the estimate of the measurand *Y* and thus the result of the measurement, is obtained by appropriately combining the standard uncertainties of the input estimates *x*_{1}, *x*_{2}, ..., *x*_{N} (see 4.1). This *combined standard uncertainty* of the estimate *y* is denoted by *u*_{c}(*y*).

NOTE For reasons similar to those given in the note to 4.3.1, the symbols *u*_{c}(*y*) and *u*^{2}_{c}(*y*) are used in all cases.

5.1.2 The combined standard uncertainty *u*_{c}(*y*) is the positive square root of the combined variance *u*^{2}_{c}(*y*), which is given by

*u*^{2}_{c}(*y*) = ∑^{N}_{i=1}(∂*f*⁄∂*x*_{i})^{2}*u*^{2}(*x*_{i})    (10)

where *f* is the function given in Equation (1). Each *u*(*x*_{i}) is a standard uncertainty evaluated as described in 4.2 (Type A evaluation) or as in 4.3 (Type B evaluation). The combined standard uncertainty *u*_{c}(*y*) is an estimated standard deviation and characterizes the dispersion of the values that could reasonably be attributed to the measurand *Y*.

Equation (10) and its counterpart for correlated input quantities, Equation (13), both of which are based on a first‑order Taylor series approximation of *Y* = *f*(*X*_{1}, *X*_{2}, ..., *X*_{N}), express what is termed in this *Guide* the *law of propagation of uncertainty* (see E.3.1 and E.3.2).
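As an illustration only (not part of the *Guide*), the propagation of Equation (10) for independent inputs can be sketched in Python; the function name, step size, and input values below are invented for the example, and the sensitivity coefficients are approximated by central differences:

```python
import math

def combined_standard_uncertainty(f, x, u, h_scale=1e-6):
    """Equation (10) for independent inputs:
    u_c^2(y) = sum_i (df/dx_i)^2 * u^2(x_i).
    Partial derivatives are estimated by central differences."""
    uc2 = 0.0
    for i in range(len(x)):
        h = h_scale * (abs(x[i]) or 1.0)   # finite-difference step
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        dfdx = (f(xp) - f(xm)) / (2.0 * h)  # sensitivity coefficient c_i
        uc2 += (dfdx * u[i]) ** 2
    return math.sqrt(uc2)

# Y = X1 * X2: analytically u_c^2 = x2^2 u1^2 + x1^2 u2^2 = 0.25
uc = combined_standard_uncertainty(lambda v: v[0] * v[1], [2.0, 3.0], [0.1, 0.2])
```

For this product model the analytic result is *u*_{c} = 0.5, which the numerical sketch reproduces.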

NOTE When the nonlinearity of *f* is significant, higher‑order terms in the Taylor series expansion must be included in the expression for *u*^{2}_{c}(*y*), Equation (10). When the distribution of each *X*_{i} is normal, the most important terms of next highest order to be added to the terms of Equation (10) are

∑^{N}_{i=1}∑^{N}_{j=1}[½(∂^{2}*f*⁄∂*x*_{i}∂*x*_{j})^{2} + (∂*f*⁄∂*x*_{i})(∂^{3}*f*⁄∂*x*_{i}∂*x*^{2}_{j})]*u*^{2}(*x*_{i})*u*^{2}(*x*_{j})

See H.1 for an example of a situation where the contribution of higher‑order terms to *u*^{2}_{c}(*y*) needs to be considered.
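The effect of nonlinearity can be made concrete with a small Monte Carlo check (an illustration under assumed values, not from the *Guide*). For *Y* = *X*^{2} with *X* normal, the first‑order result is *u*_{c}(*y*) = │2*μ*│*u*, while the exact standard deviation of *X*^{2} is [4*μ*^{2}*u*^{2} + 2*u*^{4}]^{1/2}; the 2*u*^{4} term is exactly the higher‑order correction ½(∂^{2}*f*⁄∂*x*^{2})^{2}*u*^{4} of the note above:

```python
import math
import random

random.seed(1)

mu, u = 1.0, 0.5                       # assumed mean and standard uncertainty
first_order = abs(2.0 * mu) * u        # Equation (10): |df/dx| * u

# Monte Carlo estimate of the standard deviation of Y = X^2, X ~ N(mu, u^2)
samples = [(mu + random.gauss(0.0, u)) ** 2 for _ in range(100_000)]
mean = sum(samples) / len(samples)
mc = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))

# exact standard deviation of X^2 for normal X; 2*u**4 is the
# higher-order (second-derivative) term of the note
exact = math.sqrt(4.0 * mu ** 2 * u ** 2 + 2.0 * u ** 4)
```

Here `first_order` underestimates `exact`, and the simulated value agrees with the corrected expression.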

5.1.3 The partial derivatives ∂*f*⁄∂*x*_{i} are equal to ∂*f*⁄∂*X*_{i} evaluated at *X*_{i} = *x*_{i} (see Note 1 below). These derivatives, often called sensitivity coefficients, describe how the output estimate *y* varies with changes in the values of the input estimates *x*_{1}, *x*_{2}, ..., *x*_{N}. In particular, the change in *y* produced by a small change Δ*x*_{i} in input estimate *x*_{i} is given by (Δ*y*)_{i} = (∂*f*⁄∂*x*_{i})(Δ*x*_{i}). If this change is generated by the standard uncertainty of the estimate *x*_{i}, the corresponding variation in *y* is (∂*f*⁄∂*x*_{i})*u*(*x*_{i}). The combined variance *u*^{2}_{c}(*y*) can therefore be viewed as a sum of terms, each of which represents the estimated variance associated with the output estimate *y* generated by the estimated variance associated with each input estimate *x*_{i}. This suggests writing Equation (10) as

*u*^{2}_{c}(*y*) = ∑^{N}_{i=1}[*c*_{i}*u*(*x*_{i})]^{2} ≡ ∑^{N}_{i=1}*u*^{2}_{i}(*y*)    (11a)

where

*c*_{i} ≡ ∂*f*⁄∂*x*_{i},   *u*_{i}(*y*) ≡ │*c*_{i}│*u*(*x*_{i})    (11b)

NOTE 1 Strictly speaking, the partial derivatives are ∂*f*⁄∂*x*_{i} = ∂*f*⁄∂*X*_{i} evaluated at the expectations of the *X*_{i}. However, in practice, the partial derivatives are estimated by ∂*f*⁄∂*X*_{i} evaluated at *X*_{1} = *x*_{1}, *X*_{2} = *x*_{2}, ..., *X*_{N} = *x*_{N}.

NOTE 2 The combined standard uncertainty *u*_{c}(*y*) may be calculated numerically by replacing *c*_{i}*u*(*x*_{i}) in Equation (11a) with

*Z*_{i} = ½{*f*[*x*_{1}, ..., *x*_{i} + *u*(*x*_{i}), ..., *x*_{N}] − *f*[*x*_{1}, ..., *x*_{i} − *u*(*x*_{i}), ..., *x*_{N}]}

That is, *u*_{i}(*y*) is evaluated numerically by calculating the change in *y* due to a change in *x*_{i} of +*u*(*x*_{i}) and of −*u*(*x*_{i}). The value of *u*_{i}(*y*) may then be taken as │*Z*_{i}│ and the value of the corresponding sensitivity coefficient *c*_{i} as *Z*_{i}⁄*u*(*x*_{i}).
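The numerical procedure of this note can be sketched directly (the model and values below are invented for the example):

```python
def u_i_numeric(f, x, u, i):
    """Z_i of 5.1.3 Note 2: half the change in y when x_i is shifted
    by +u(x_i) and by -u(x_i); |Z_i| estimates u_i(y)."""
    xp, xm = list(x), list(x)
    xp[i] += u[i]
    xm[i] -= u[i]
    return 0.5 * (f(xp) - f(xm))

# Y = X1^2 + X2, so c_1 = 2*x1 = 6 at x1 = 3
z0 = u_i_numeric(lambda v: v[0] ** 2 + v[1], [3.0, 1.0], [0.01, 0.02], 0)
c0 = z0 / 0.01          # corresponding sensitivity coefficient Z_1/u(x_1)
```

For this model the analytic sensitivity coefficient is 6, and *Z*_{1} = 0.06, in agreement with *c*_{1}*u*(*x*_{1}).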

EXAMPLE For the example of 4.1.1, using the same symbol for both the quantity and its estimate for simplicity of notation,

*u*^{2}_{c}(*P*) = (∂*P*⁄∂*V*)^{2}*u*^{2}(*V*) + (∂*P*⁄∂*R*_{0})^{2}*u*^{2}(*R*_{0}) + (∂*P*⁄∂*α*)^{2}*u*^{2}(*α*) + (∂*P*⁄∂*t*)^{2}*u*^{2}(*t*)

and

∂*P*⁄∂*V* = 2*V*⁄{*R*_{0}[1 + *α*(*t* − *t*_{0})]} = 2*P*⁄*V*
∂*P*⁄∂*R*_{0} = −*V*^{2}⁄{*R*^{2}_{0}[1 + *α*(*t* − *t*_{0})]} = −*P*⁄*R*_{0}
∂*P*⁄∂*α* = −*V*^{2}(*t* − *t*_{0})⁄{*R*_{0}[1 + *α*(*t* − *t*_{0})]^{2}} = −*P*(*t* − *t*_{0})⁄[1 + *α*(*t* − *t*_{0})]
∂*P*⁄∂*t* = −*V*^{2}*α*⁄{*R*_{0}[1 + *α*(*t* − *t*_{0})]^{2}} = −*Pα*⁄[1 + *α*(*t* − *t*_{0})]
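The closed‑form coefficients of this example can be cross‑checked numerically; the numerical values below are assumed for illustration only:

```python
def P(V, R0, alpha, t, t0=20.0):
    # power dissipated in a temperature-dependent resistor (model of 4.1.1)
    return V ** 2 / (R0 * (1.0 + alpha * (t - t0)))

# illustrative input estimates (not from the Guide)
V, R0, alpha, t = 5.0, 10.0, 0.004, 25.0
p = P(V, R0, alpha, t)

# analytic sensitivity coefficients expressed through P, as in the example
c_V = 2.0 * p / V
c_R0 = -p / R0

# central-difference check of dP/dV
h = 1e-6
c_V_num = (P(V + h, R0, alpha, t) - P(V - h, R0, alpha, t)) / (2.0 * h)
```

The finite‑difference estimate agrees with 2*P*⁄*V* to within the truncation error of the difference scheme.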
5.1.4 Instead of being calculated from the function *f*, sensitivity coefficients ∂*f*⁄∂*x*_{i} are sometimes determined experimentally: one measures the change in *Y* produced by a change in a particular *X*_{i} while holding the remaining input quantities constant. In this case, the knowledge of the function *f* (or a portion of it when only several sensitivity coefficients are so determined) is accordingly reduced to an empirical first‑order Taylor series expansion based on the measured sensitivity coefficients.

5.1.5 If Equation (1) for the measurand *Y* is expanded about nominal values *X*_{i,0} of the input quantities *X*_{i}, then, to first order (which is usually an adequate approximation), *Y* = *Y*_{0} + *c*_{1}*δ*_{1} + *c*_{2}*δ*_{2} + ... + *c*_{N}*δ*_{N}, where *Y*_{0} = *f*(*X*_{1,0}, *X*_{2,0}, ..., *X*_{N,0}), *c*_{i} = (∂*f*⁄∂*X*_{i}) evaluated at *X*_{i} = *X*_{i,0}, and *δ*_{i} = *X*_{i} − *X*_{i,0}. Thus, for the purposes of an analysis of uncertainty, a measurand is usually approximated by a linear function of its variables by transforming its input quantities from *X*_{i} to *δ*_{i} (see E.3.1).

EXAMPLE From Example 2 of 4.3.7, the estimate of the value of the measurand *V* is *V* = *V̄* + Δ*V̄*, and the combined standard uncertainty is *u*_{c}(*V*) = 15 µV, which corresponds to a relative combined standard uncertainty *u*_{c}(*V*)⁄*V* of 16 × 10^{−6} (see 5.1.6). This is an example of the case where the measurand is already a linear function of the quantities on which it depends, with coefficients *c*_{i} = +1. It follows from Equation (10) that if *Y* = *c*_{1}*X*_{1} + *c*_{2}*X*_{2} + ... + *c*_{N}*X*_{N} and if the constants *c*_{i} = +1 or −1, then *u*^{2}_{c}(*y*) = ∑^{N}_{i=1}*u*^{2}(*x*_{i}).

5.1.6 If *Y* is of the form *Y* = *cX*_{1}^{p1}*X*_{2}^{p2} ... *X*_{N}^{pN} and the exponents *p*_{i} are known positive or negative numbers having negligible uncertainties, the combined variance, Equation (10), can be expressed as

[*u*_{c}(*y*)⁄*y*]^{2} = ∑^{N}_{i=1}[*p*_{i}*u*(*x*_{i})⁄*x*_{i}]^{2}    (12)

This is of the same form as Equation (11a) but with the combined variance *u*^{2}_{c}(*y*) expressed as a *relative combined variance* [*u*_{c}(*y*)⁄*y*]^{2} and the estimated variance *u*^{2}(*x*_{i}) associated with each input estimate expressed as an estimated *relative variance* [*u*(*x*_{i})⁄*x*_{i}]^{2}. [The *relative combined standard uncertainty* is *u*_{c}(*y*)⁄│*y*│ and the *relative standard uncertainty* of each input estimate is *u*(*x*_{i})⁄│*x*_{i}│, │*y*│ ≠ 0 and │*x*_{i}│ ≠ 0.]
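For product‑form measurands, Equation (12) works entirely in relative uncertainties. A short sketch (exponents and relative uncertainties below are invented for the example):

```python
import math

def relative_combined_uncertainty(rel_u, p):
    """Equation (12): [u_c(y)/y]^2 = sum_i [p_i * u(x_i)/x_i]^2
    for Y = c * X1^p1 * ... * XN^pN with independent inputs."""
    return math.sqrt(sum((pi * ri) ** 2 for pi, ri in zip(p, rel_u)))

# Y = X1 * X2^2 / X3 with relative standard uncertainties 1 %, 2 %, 3 %
rel = relative_combined_uncertainty([0.01, 0.02, 0.03], [1, 2, -1])
```

Note that the exponent enters squared, so the 2 % input contributes as 4 %; the sign of the exponent does not matter for independent inputs.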

NOTE 1 When *Y* has this form, its transformation to a linear function of variables (see 5.1.5) is readily achieved by setting *X*_{i} = *X*_{i,0}(1 + *δ*_{i}), for then the following approximate relation results: (*Y* − *Y*_{0})⁄*Y*_{0} = ∑^{N}_{i=1}*p*_{i}*δ*_{i}. On the other hand, the logarithmic transformation *Z* = ln *Y* and *W*_{i} = ln *X*_{i} leads to an exact linearization in terms of the new variables: *Z* = ln *c* + ∑^{N}_{i=1}*p*_{i}*W*_{i}.

NOTE 2 If each *p*_{i} is either +1 or −1, Equation (12) becomes [*u*_{c}(*y*)⁄*y*]^{2} = ∑^{N}_{i=1}[*u*(*x*_{i})⁄*x*_{i}]^{2}, which shows that, for this special case, the relative combined variance associated with the estimate *y* is simply equal to the sum of the estimated relative variances associated with the input estimates *x*_{i}.

5.2.1 Equation (10) and those derived from it such as Equations (11a) and (12) are valid only if the input quantities *X*_{i} are independent or uncorrelated (the random variables, not the physical quantities that are assumed to be invariants; see 4.1.1, Note 1). If some of the *X*_{i} are significantly correlated, the correlations must be taken into account.

5.2.2 When the input quantities are correlated, the appropriate expression for the combined variance *u*^{2}_{c}(*y*) associated with the result of a measurement is

*u*^{2}_{c}(*y*) = ∑^{N}_{i=1}∑^{N}_{j=1}(∂*f*⁄∂*x*_{i})(∂*f*⁄∂*x*_{j})*u*(*x*_{i}, *x*_{j}) = ∑^{N}_{i=1}(∂*f*⁄∂*x*_{i})^{2}*u*^{2}(*x*_{i}) + 2∑^{N−1}_{i=1}∑^{N}_{j=i+1}(∂*f*⁄∂*x*_{i})(∂*f*⁄∂*x*_{j})*u*(*x*_{i}, *x*_{j})    (13)

where *x*_{i} and *x*_{j} are the estimates of *X*_{i} and *X*_{j}, and *u*(*x*_{i}, *x*_{j}) = *u*(*x*_{j}, *x*_{i}) is the estimated covariance associated with *x*_{i} and *x*_{j}. The degree of correlation between *x*_{i} and *x*_{j} is characterized by the estimated correlation coefficient (C.3.6)

*r*(*x*_{i}, *x*_{j}) = *u*(*x*_{i}, *x*_{j})⁄[*u*(*x*_{i})*u*(*x*_{j})]    (14)

where *r*(*x*_{i}, *x*_{j}) = *r*(*x*_{j}, *x*_{i}) and −1 ≤ *r*(*x*_{i}, *x*_{j}) ≤ +1. If the estimates *x*_{i} and *x*_{j} are independent, *r*(*x*_{i}, *x*_{j}) = 0. In terms of correlation coefficients, which are more readily interpreted than covariances, the covariance term of Equation (13) may be written as

2∑^{N−1}_{i=1}∑^{N}_{j=i+1}(∂*f*⁄∂*x*_{i})(∂*f*⁄∂*x*_{j})*u*(*x*_{i})*u*(*x*_{j})*r*(*x*_{i}, *x*_{j})    (15)

Equation (13) then becomes, with the aid of Equation (11b),

*u*^{2}_{c}(*y*) = ∑^{N}_{i=1}*c*^{2}_{i}*u*^{2}(*x*_{i}) + 2∑^{N−1}_{i=1}∑^{N}_{j=i+1}*c*_{i}*c*_{j}*u*(*x*_{i})*u*(*x*_{j})*r*(*x*_{i}, *x*_{j})    (16)
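Equation (16) can be sketched directly in Python; the sensitivity coefficients, uncertainties, and correlation matrix below are invented for the example:

```python
import math

def combined_variance_correlated(c, u, r):
    """Equation (16): sum_i c_i^2 u_i^2
    + 2 * sum_{i<j} c_i c_j u_i u_j r_ij,
    with r supplied as a full symmetric correlation matrix."""
    n = len(c)
    var = sum((c[i] * u[i]) ** 2 for i in range(n))
    for i in range(n - 1):
        for j in range(i + 1, n):
            var += 2.0 * c[i] * c[j] * u[i] * u[j] * r[i][j]
    return var

# Y = X1 + X2 with r(x1, x2) = +1: u_c = u1 + u2, a linear sum (see Note 1)
var = combined_variance_correlated([1.0, 1.0], [0.3, 0.4],
                                   [[1.0, 1.0], [1.0, 1.0]])
uc = math.sqrt(var)
```

With full positive correlation the quadrature sum 0.5 is replaced by the linear sum 0.7, as Note 1 below describes.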

NOTE 1 For the very special case where *all* of the input estimates are correlated with correlation coefficients *r*(*x*_{i}, *x*_{j}) = +1, Equation (16) reduces to

*u*^{2}_{c}(*y*) = [∑^{N}_{i=1}*c*_{i}*u*(*x*_{i})]^{2} = [∑^{N}_{i=1}(∂*f*⁄∂*x*_{i})*u*(*x*_{i})]^{2}

The combined standard uncertainty *u*_{c}(*y*) is thus simply a *linear sum* of terms representing the variation of the output estimate *y* generated by the standard uncertainty of each input estimate *x*_{i} (see 5.1.3). [This linear sum should not be confused with the general law of error propagation although it has a similar form; standard uncertainties are not errors (see E.3.2).]

EXAMPLE Ten resistors, each of nominal resistance *R*_{i} = 1 000 Ω, are calibrated with a negligible uncertainty of comparison in terms of the same 1 000 Ω standard resistor *R*_{s} characterized by a standard uncertainty *u*(*R*_{s}) = 100 mΩ as given in its calibration certificate. The resistors are connected in series with wires having negligible resistance in order to obtain a reference resistance *R*_{ref} of nominal value 10 kΩ. Thus *R*_{ref} = *f*(*R*_{i}) = ∑^{10}_{i=1}*R*_{i}. Since *r*(*x*_{i}, *x*_{j}) = *r*(*R*_{i}, *R*_{j}) = +1 for each resistor pair (see F.1.2.3, Example 2), the equation of this note applies. Since for each resistor ∂*f*⁄∂*x*_{i} = ∂*R*_{ref}⁄∂*R*_{i} = 1 and *u*(*x*_{i}) = *u*(*R*_{i}) = *u*(*R*_{s}) (see F.1.2.3, Example 2), that equation yields for the combined standard uncertainty of *R*_{ref}, *u*_{c}(*R*_{ref}) = ∑^{10}_{i=1}*u*(*R*_{s}) = 10 × (100 mΩ) = 1 Ω. The result *u*_{c}(*R*_{ref}) = [∑^{10}_{i=1}*u*^{2}(*R*_{s})]^{1/2} = 0,32 Ω obtained from Equation (10) is incorrect because it does not take into account that all of the calibrated values of the ten resistors are correlated.
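The two results of this example can be reproduced in a few lines (a sketch of the arithmetic, not part of the *Guide*):

```python
import math

N = 10
u_Rs = 0.100   # 100 mOhm standard uncertainty of the standard resistor, in Ohm

# fully correlated case (r = +1 for every pair): linear sum of Note 1
u_correlated = sum(u_Rs for _ in range(N))                   # 1 Ohm

# incorrectly treating the ten calibrated values as independent, Equation (10)
u_independent = math.sqrt(sum(u_Rs ** 2 for _ in range(N)))  # about 0.32 Ohm
```

The quadrature result understates the uncertainty by a factor of √10 because every resistor's calibrated value shares the uncertainty of the same standard.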

NOTE 2 The estimated variances *u*^{2}(*x*_{i}) and estimated covariances *u*(*x*_{i}, *x*_{j}) may be considered as the elements of a covariance matrix with elements *u*_{ij}. The diagonal elements *u*_{ii} of the matrix are the variances *u*^{2}(*x*_{i}), while the off‑diagonal elements *u*_{ij} (*i* ≠ *j*) are the covariances *u*(*x*_{i}, *x*_{j}) = *u*(*x*_{j}, *x*_{i}). If two input estimates are uncorrelated, their associated covariance and the corresponding elements *u*_{ij} and *u*_{ji} of the covariance matrix are 0. If the input estimates are all uncorrelated, all of the off‑diagonal elements are zero and the covariance matrix is diagonal. (See also C.3.5.)
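In matrix form, Equation (13) is the quadratic form *u*^{2}_{c}(*y*) = **c**ᵀ**U****c** with **U** the covariance matrix of this note. A sketch with invented numbers (the covariance here corresponds to *r* = +0.5):

```python
import math

def combined_variance_matrix(c, U):
    """u_c^2(y) = c^T U c, where U holds the variances u^2(x_i) on the
    diagonal and the covariances u(x_i, x_j) off the diagonal."""
    n = len(c)
    return sum(c[i] * U[i][j] * c[j] for i in range(n) for j in range(n))

# two inputs: u1 = 0.1, u2 = 0.2, covariance u12 = 0.01 (r = +0.5)
U = [[0.01, 0.01],
     [0.01, 0.04]]
uc = math.sqrt(combined_variance_matrix([2.0, -1.0], U))
```

Because the sensitivity coefficients have opposite signs here, the positive covariance *reduces* the combined uncertainty (0.2 instead of the uncorrelated 0.28).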

NOTE 3 For the purposes of numerical evaluation, Equation (16) may be written as

*u*^{2}_{c}(*y*) = ∑^{N}_{i=1}∑^{N}_{j=1}*Z*_{i}*Z*_{j}*r*(*x*_{i}, *x*_{j})

where *Z*_{i} is given in 5.1.3, Note 2.

NOTE 4 If the *X*_{i} of the special form considered in 5.1.6 are correlated, then the terms

2∑^{N−1}_{i=1}∑^{N}_{j=i+1}[*p*_{i}*u*(*x*_{i})⁄*x*_{i}][*p*_{j}*u*(*x*_{j})⁄*x*_{j}]*r*(*x*_{i}, *x*_{j})

must be added to the right‑hand side of Equation (12).

5.2.3 Consider two arithmetic means *q̄* and *r̄* that estimate the expectations *μ*_{q} and *μ*_{r} of two randomly varying quantities *q* and *r*, and let *q̄* and *r̄* be calculated from *n* independent pairs of simultaneous observations of *q* and *r* made under the same conditions of measurement (see B.2.15). Then the covariance (see C.3.4) of *q̄* and *r̄* is estimated by

*s*(*q̄*, *r̄*) = [1⁄*n*(*n* − 1)]∑^{n}_{k=1}(*q*_{k} − *q̄*)(*r*_{k} − *r̄*)    (17)

where *q*_{k} and *r*_{k} are the individual observations of the quantities *q* and *r* made simultaneously. Thus the estimated covariance of two correlated input quantities *X*_{i} and *X*_{j} that are estimated by the means *X̄*_{i} and *X̄*_{j} determined from independent pairs of repeated simultaneous observations is given by *u*(*x*_{i}, *x*_{j}) = *s*(*X̄*_{i}, *X̄*_{j}), with *s*(*X̄*_{i}, *X̄*_{j}) calculated according to Equation (17).
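Equation (17) translates directly into code; the paired observations below are invented to illustrate the calculation:

```python
def covariance_of_means(q, r):
    """Equation (17): s(q_bar, r_bar)
    = [1/(n(n-1))] * sum_k (q_k - q_bar)(r_k - r_bar),
    from n independent pairs of simultaneous observations."""
    n = len(q)
    qb = sum(q) / n
    rb = sum(r) / n
    return sum((qk - qb) * (rk - rb) for qk, rk in zip(q, r)) / (n * (n - 1))

# perfectly linearly related observations give a positive covariance
q = [1.0, 2.0, 3.0, 4.0]
r = [2.0, 4.0, 6.0, 8.0]
s_qr = covariance_of_means(q, r)
```

Note the divisor *n*(*n* − 1): this is the covariance of the *means*, smaller by a factor of *n* than the sample covariance of the observations themselves.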

NOTE Examples where it is necessary to use covariances as calculated from Equation (17) are given in H.2 and H.4.

5.2.4 There may be significant correlation between two input quantities if the same measuring instrument, physical measurement standard, or reference datum having a significant standard uncertainty is used in their determination. For example, if a certain thermometer is used to determine a temperature correction required in the estimation of the value of input quantity *X*_{i}, and the same thermometer is used to determine a similar temperature correction required in the estimation of input quantity *X*_{j}, the two input quantities could be significantly correlated. However, if *X*_{i} and *X*_{j} in this example are redefined to be the uncorrected quantities and the quantities that define the calibration curve for the thermometer are included as additional input quantities with independent standard uncertainties, the correlation between *X*_{i} and *X*_{j} is removed. (See F.1.2.3 and F.1.2.4 for further discussion.)

5.2.5 Correlations between input quantities cannot be ignored if present and significant. The associated covariances should be evaluated experimentally if feasible by varying the correlated input quantities (see C.3.6, Note 3), or by using the pool of available information on the correlated variability of the quantities in question (Type B evaluation of covariance). Insight based on experience and general knowledge (see 4.3.1 and 4.3.2) is especially required when estimating the degree of correlation between input quantities arising from the effects of common influences, such as ambient temperature, barometric pressure, and humidity. Fortunately, in many cases, the effects of such influences have negligible interdependence and the affected input quantities can be assumed to be uncorrelated. However, if they cannot be assumed to be uncorrelated, the correlations themselves can be avoided if the common influences are introduced as additional independent input quantities as indicated in 5.2.4.