Instrument sensitivity

For a measurement made with any quantitative instrument, the sensitivity of the instrument is defined as the range of output divided by the range of input.

That is:

Sensitivity = (maximum output - minimum output)/(maximum input - minimum input).

Example:

An amplifier must operate over an input range of 0 mmHg to 200 mmHg, producing an output voltage of 0 V to 2 V.

Thus the sensitivity is calculated as follows:

Sensitivity = (2 V - 0 V)/(200 mmHg - 0 mmHg)

Therefore:

Sensitivity = 0.01 V/mmHg
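
The same calculation can be written as a minimal sketch in Python; the function name sensitivity and its arguments are illustrative and not part of any standard library:

    def sensitivity(max_output, min_output, max_input, min_input):
        """Return instrument sensitivity: output range divided by input range."""
        return (max_output - min_output) / (max_input - min_input)

    # Example from the text: 0-2 V output over a 0-200 mmHg input range.
    print(sensitivity(2.0, 0.0, 200.0, 0.0))  # 0.01 V/mmHg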


The sensitivity of an instrument can also be defined as the slope of its calibration curve; its units are the units of the output divided by the units of the input (for example, V/mmHg).
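
A minimal sketch of this idea, assuming calibration data are available as paired input/output readings and that NumPy is installed; the sample values below are hypothetical:

    import numpy as np

    # Hypothetical calibration data: applied pressure (mmHg) and measured output (V).
    pressure = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
    voltage = np.array([0.00, 0.51, 1.00, 1.49, 2.00])

    # The slope of the least-squares calibration line is the sensitivity in V/mmHg.
    slope, intercept = np.polyfit(pressure, voltage, 1)
    print(f"Sensitivity = {slope:.4f} V/mmHg")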