In metrology, linearity is actually a measure of the nonlinearity of the measurement instrument. When we talk about sensitivity, we assume that the input/output characteristic of the instrument is approximately linear. In practice, however, it is normally nonlinear, as shown in the figure below.

The linearity is defined as the maximum deviation from the linear characteristic, expressed as a percentage of the full-scale output. Thus:

\[\textrm{Linearity}=\dfrac{\Delta O}{O_{max}-O_{min}}\times 100\%\]

\[\Delta O=\max(\Delta O_1,\Delta O_2)\]
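The calculation above can be sketched numerically. In this minimal example, the "ideal" linear characteristic is taken as a least-squares line fitted to hypothetical calibration data (the source does not specify how the reference line is chosen, so this is one common assumption); the linearity is then the maximum deviation from that line as a percentage of the full-scale output:

```python
import numpy as np

# Hypothetical calibration data: input stimulus vs. instrument output.
inputs = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
outputs = np.array([0.02, 1.05, 1.98, 3.10, 3.95, 5.01])

# Assumed linear characteristic: a least-squares line through the data.
slope, intercept = np.polyfit(inputs, outputs, 1)
fitted = slope * inputs + intercept

# Delta O: the maximum deviation of the output from the linear characteristic.
delta_o = np.max(np.abs(outputs - fitted))

# Linearity as a percentage of the full-scale output (O_max - O_min).
full_scale = outputs.max() - outputs.min()
linearity = delta_o / full_scale * 100.0

print(f"Linearity: {linearity:.2f}% of full scale")
```

A smaller value indicates a more nearly linear instrument; the data and the use of a least-squares reference line are illustrative assumptions, not part of the source definition.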