Fermat's theorem (stationary points)


Fermat's theorem is a theorem in real analysis, named after Pierre de Fermat. It gives a method for finding local maxima and minima of differentiable functions by showing that every local extremum of the function is a stationary point (the derivative of the function is zero at that point). Fermat's theorem thus reduces the problem of finding the extrema of a function to that of solving an equation.

It is important to note that Fermat's theorem gives only a necessary condition for extreme function values: not every stationary point is an extremum. For example, a stationary point may be an inflection point with a horizontal tangent. To check whether a stationary point is an extremum, and to further distinguish between a maximum and a minimum, one can analyse the second derivative (if it exists).
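As a worked illustration (a standard textbook example, not specific to this article), consider f(x) = x^3 - 3x. The equation

\frac{d}{dx}\left(x^3 - 3x\right) = 3x^2 - 3 = 0

has the solutions x = -1 and x = 1. Since f''(-1) = -6 < 0 and f''(1) = 6 > 0, the function has a local maximum at -1 and a local minimum at 1. By contrast, f(x) = x^3 satisfies f'(0) = 0 even though 0 is not an extremum but an inflection point with a horizontal tangent, showing that the condition f'(x_0) = 0 is necessary but not sufficient.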


Fermat's theorem

Let f\colon (a,b) \rightarrow \mathbb{R} be a function and suppose that x_0 \in (a,b) is a local extremum of f. If f is differentiable at x_0, then f'(x_0) = 0.

Application to optimization

See also: maxima and minima

As a corollary, global extrema of a function f on a domain A occur only at boundaries, non-differentiable points, and stationary points. If x_0 is a global extremum of f, then one of the following is true (an illustrative sketch follows the list):

  • boundary: x_0 is in the boundary of A
  • non-differentiable: f is not differentiable at x_0
  • stationary point: x_0 is a stationary point of f
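As a minimal sketch of how this corollary is used in practice (illustrative only; the choice of function and interval is an assumption made here, and the SymPy library is assumed to be available), consider f(x) = x^3 - 3x on the closed interval [-2, 3]. Because f is differentiable everywhere, the only candidates for global extrema are the stationary points inside the interval and the two endpoints:

    import sympy as sp

    x = sp.symbols('x')
    f = x**3 - 3*x               # illustrative function
    a, b = -2, 3                 # endpoints of the closed interval [a, b]

    # Stationary points inside [a, b]: by Fermat's theorem, every interior
    # extremum of the differentiable function f satisfies f'(x) = 0.
    stationary = [s for s in sp.solve(sp.diff(f, x), x) if a <= s <= b]

    # Candidates for global extrema: stationary points plus the two endpoints.
    candidates = stationary + [sp.Integer(a), sp.Integer(b)]
    values = {c: f.subs(x, c) for c in candidates}

    print(values)                       # {-1: 2, 1: -2, -2: -2, 3: 18}
    print(max(values, key=values.get))  # 3  (global maximum, at the boundary)
    print(min(values, key=values.get))  # 1  (a global minimum; f(-2) = -2 too)

Here the global maximum is attained at the boundary point x = 3, while the global minimum value -2 is attained both at the stationary point x = 1 and at the boundary point x = -2; the non-differentiable case does not arise for this choice of f.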

Intuition

The intuition is based on the behavior of polynomial functions. Assume that the function f has a maximum at x_0, the reasoning being similar for a minimum. If x_0 \in (a,b) is a local maximum, then there is a (possibly small) neighbourhood of x_0 in which the function is increasing before and decreasing after x_0. As the derivative is positive where the function is increasing and negative where it is decreasing, f' is positive before and negative after x_0. By Darboux's theorem, f' does not skip values, so it must be zero at some point between the positive and the negative values. The only point in the neighbourhood where f'(x) = 0 is possible is x_0.
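For instance (a simple illustrative choice), for f(x) = 1 - x^2 with maximum at x_0 = 0, the derivative f'(x) = -2x is positive for x < 0 and negative for x > 0, and the only point at which it vanishes is x_0 = 0, where the maximum is attained.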

Note that the theorem (and its proof below) is more general than the intuition, in that it does not require the function to be differentiable on a whole neighbourhood of x_0. As stated in the theorem, it is sufficient for the function to be differentiable only at the extreme point.
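A concrete example of this generality (a standard construction, added here for illustration) is the function defined by f(x) = -x^2 for rational x and f(x) = -2x^2 for irrational x. It has a global maximum at 0 and is differentiable there with f'(0) = 0, since |f(h) - f(0)| \le 2h^2, yet it is discontinuous, and hence non-differentiable, at every other point; Fermat's theorem still applies at x_0 = 0.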

Proof

Suppose that x_0 is a local maximum (a similar proof applies if x_0 is a local minimum). Then there exists \delta > 0 such that (x_0 - \delta, x_0 + \delta) \subset (a,b) and f(x_0) \ge f(x) for all x with |x - x_0| < \delta. Hence for any h \in (0,\delta) we have

\frac{f(x_0+h) - f(x_0)}{h} \le 0.

Since f is differentiable at x_0, the limit of this ratio as h tends to 0 from above exists and equals f'(x_0), so we conclude that f'(x_0) \le 0. On the other hand, for h \in (-\delta,0) we have

\frac{f(x_0+h) - f(x_0)}{h} \ge 0

but again the limit as h tends to 0 from below exists and equals f'(x_0), so we also have f'(x_0) \ge 0.

Hence we conclude that f'(x_0) = 0.
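To see the two inequalities in a concrete case (an illustrative check, not part of the proof), take f(x) = \sin x and x_0 = \pi/2, a local maximum. The difference quotient is

\frac{\sin(\pi/2 + h) - 1}{h} = \frac{\cos h - 1}{h},

which is \le 0 for h > 0 and \ge 0 for h < 0 because \cos h - 1 \le 0; both one-sided limits are 0, in agreement with f'(\pi/2) = \cos(\pi/2) = 0.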
