Instrument sensitivity
For a measurement made with any quantitative instrument, the sensitivity of the instrument is defined as the range of its output divided by the range of its input.
That is:
Sensitivity = (maximum output - minimum output)/(maximum input - minimum input).
Example:
An amplifier must operate over an input range of 0 mmHg to 200 mmHg, producing an output voltage of 0 V to 2 V.
Thus the sensitivity is calculated as follows:
Sensitivity = (2 V - 0 V)/(200 mmHg - 0 mmHg)
Therefore
Sensitivity = 0.01 V/mmHg
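
The calculation above can also be written as a short Python sketch; the function name and argument order are illustrative only and are not part of any standard library.

# A minimal sketch (hypothetical helper) computing sensitivity as
# output range divided by input range.
def sensitivity(out_max, out_min, in_max, in_min):
    """Return sensitivity in output units per input unit."""
    return (out_max - out_min) / (in_max - in_min)

# The amplifier example above: 0 to 2 V output for a 0 to 200 mmHg input.
print(sensitivity(2.0, 0.0, 200.0, 0.0))  # prints 0.01, i.e. 0.01 V/mmHg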
The sensitivity of an instrument can also be defined as the slope of its calibration curve; its units are the output units divided by the input units (for example, volts per millimetre of mercury).
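
When a calibration curve is available as measured data points, its slope can be estimated with a least-squares straight-line fit. The sketch below uses NumPy and invented calibration data for the same 0 to 200 mmHg amplifier; the data values are assumptions for illustration.

# Sketch: estimate sensitivity as the slope of a calibration curve
# using a least-squares straight-line fit over invented data points.
import numpy as np

pressure_mmHg = np.array([0.0, 50.0, 100.0, 150.0, 200.0])  # input values
voltage_V = np.array([0.0, 0.5, 1.0, 1.5, 2.0])             # measured output

# polyfit with degree 1 returns [slope, intercept] of the best-fit line.
slope, intercept = np.polyfit(pressure_mmHg, voltage_V, 1)
print(slope)  # about 0.01 V/mmHg, the instrument's sensitivity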

