Hessian Matrix


In mathematics, the Hessian matrix (or simply the Hessian) is the square matrix of second-order partial derivatives of a function; that is, it describes the local curvature of a function of many variables. The Hessian matrix was developed in the 19th century by the German mathematician Ludwig Otto Hesse and later named after him. Hesse himself had used the term "functional determinants".

Given the real-valued function

$$f(x_1, x_2, \dots, x_n),$$

if all second partial derivatives of f exist, then the Hessian matrix of f is the matrix

$$H(f)_{ij}(x) = D_i D_j f(x),$$

where $x = (x_1, x_2, \dots, x_n)$ and $D_i$ is the differentiation operator with respect to the i-th argument, and the Hessian becomes

$$H(f) = \begin{bmatrix} \dfrac{\partial^2 f}{\partial x_1^2} & \dfrac{\partial^2 f}{\partial x_1\,\partial x_2} & \cdots & \dfrac{\partial^2 f}{\partial x_1\,\partial x_n} \\[1ex] \dfrac{\partial^2 f}{\partial x_2\,\partial x_1} & \dfrac{\partial^2 f}{\partial x_2^2} & \cdots & \dfrac{\partial^2 f}{\partial x_2\,\partial x_n} \\ \vdots & \vdots & \ddots & \vdots \\ \dfrac{\partial^2 f}{\partial x_n\,\partial x_1} & \dfrac{\partial^2 f}{\partial x_n\,\partial x_2} & \cdots & \dfrac{\partial^2 f}{\partial x_n^2} \end{bmatrix}.$$

Because f is often clear from context, the notation $H(f)(x)$ is frequently shortened to simply $H(x)$.

The Hessian matrix is related to the Jacobian matrix by $H(f)(x) = J(\nabla f)(x)$.
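For concreteness, here is a minimal numerical sketch of this definition: a central finite-difference approximation to the Hessian of a scalar function. The test function and the step size h are arbitrary illustrative choices, not anything specified in the article.

```python
import numpy as np

def hessian_fd(f, x, h=1e-5):
    """Approximate the Hessian of scalar f at x by central finite
    differences: H[i, j] ~ d^2 f / (dx_i dx_j)."""
    n = x.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

# f(x, y) = x^2 y + y^3 has exact Hessian [[2y, 2x], [2x, 6y]]
f = lambda v: v[0]**2 * v[1] + v[1]**3
print(hessian_fd(f, np.array([1.0, 2.0])))  # close to [[4, 2], [2, 12]]
```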

Some mathematicians[1] define the Hessian as the determinant of the above matrix.

Hessian matrices are used in large-scale optimization problems within Newton-type methods because they are the coefficient of the quadratic term of a local Taylor expansion of a function. That is,

$$y = f(x + \Delta x) \approx f(x) + J(x)\,\Delta x + \tfrac{1}{2}\,\Delta x^{\mathsf T} H(x)\,\Delta x,$$

where J is the Jacobian matrix, which is a vector (the gradient) for scalar-valued functions. The full Hessian matrix can be difficult to compute in practice; in such situations, quasi-Newton algorithms have been developed that use approximations to the Hessian. The best-known quasi-Newton algorithm is the BFGS algorithm.
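As an illustration of the quasi-Newton idea, SciPy's minimize routine exposes BFGS directly; the sketch below minimizes the Rosenbrock function, supplying only the gradient and letting BFGS build its own approximation to the (inverse) Hessian.

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([-1.2, 1.0])
# BFGS updates an inverse-Hessian approximation from successive
# gradient differences; the true Hessian is never formed.
res = minimize(rosen, x0, method="BFGS", jac=rosen_der)
print(res.x)  # converges to the minimizer [1, 1]
```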

Mixed derivatives and symmetry of the Hessian

The mixed derivatives of f are the entries off the main diagonal in the Hessian. Assuming that they are continuous, the order of differentiation does not matter (Clairaut's theorem). For example,

$$\frac{\partial}{\partial x}\!\left(\frac{\partial f}{\partial y}\right) = \frac{\partial}{\partial y}\!\left(\frac{\partial f}{\partial x}\right).$$

This can also be written as:

$$f_{yx} = f_{xy}.$$

In a formal statement: if the second derivatives of f are all continuous in a neighborhood D, then the Hessian of f is a symmetric matrix throughout D; see symmetry of second derivatives.
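The symmetry is easy to confirm symbolically; a small SymPy check on an arbitrarily chosen smooth function:

```python
import sympy as sp

x, y = sp.symbols("x y")
f = sp.sin(x) * y**2                  # any smooth function will do
fxy = sp.diff(f, x, y)                # differentiate in x, then y
fyx = sp.diff(f, y, x)                # differentiate in y, then x
print(sp.simplify(fxy - fyx))         # 0: the mixed partials agree
```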


Critical points and discriminant

If the gradient of f (i.e. its derivative in the vector sense) is zero at some point x, then f has a critical point (or stationary point) at x. The determinant of the Hessian at x is then called the discriminant. If this determinant is zero, then x is called a degenerate critical point of f; this is also called a non-Morse critical point of f. Otherwise it is non-degenerate; this is called a Morse critical point of f.

Second derivative test

The following test can be applied at a non-degenerate critical point x. If the Hessian is positive definite at x, then f attains a local minimum at x. If the Hessian is negative definite at x, then f attains a local maximum at x. If the Hessian has both positive and negative eigenvalues then x is a saddle point for f (this is true even if x is degenerate). Otherwise the test is inconclusive.

Note that for positive semidefinite and negative semidefinite Hessians the test is inconclusive (yet a conclusion can be made that f is locally convex or concave, respectively). However, more can be said from the point of view of Morse theory.

In view of what has just been said, the second derivative test for functions of one and two variables is simple. In one variable, the Hessian contains just one second derivative: if it is positive then x is a local minimum, and if it is negative then x is a local maximum; if it is zero then the test is inconclusive. In two variables, the determinant can be used, because the determinant is the product of the eigenvalues. If it is positive then the eigenvalues are both positive or both negative. If it is negative then the two eigenvalues have different signs. If it is zero, then the second derivative test is inconclusive.
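The test translates directly into an eigenvalue check on the Hessian at the critical point; a minimal sketch, where the tolerance tol is an arbitrary numerical cutoff:

```python
import numpy as np

def classify_critical_point(H, tol=1e-10):
    """Second derivative test, given the Hessian H at a critical point."""
    eig = np.linalg.eigvalsh(H)                      # H is symmetric
    if np.all(eig > tol):
        return "local minimum"                       # positive definite
    if np.all(eig < -tol):
        return "local maximum"                       # negative definite
    if np.any(eig > tol) and np.any(eig < -tol):
        return "saddle point"                        # mixed signs
    return "inconclusive"                            # semidefinite case

# f(x, y) = x^2 - y^2 has Hessian diag(2, -2) at its critical point (0, 0)
print(classify_critical_point(np.diag([2.0, -2.0])))  # saddle point
```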

More generally, the second-order conditions that are sufficient for a local minimum or maximum can be expressed in terms of the sequence of principal (upper-leftmost) minors (determinants of sub-matrices) of the Hessian; these conditions are a special case of those given in the next section for bordered Hessians for constrained optimization, namely the case in which the number of constraints is zero.
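For reference, a short sketch computing that sequence of minors; the example matrix is an arbitrary positive definite choice, for which all leading principal minors come out positive:

```python
import numpy as np

def leading_principal_minors(H):
    """Determinants of the upper-left k-by-k submatrices, k = 1..n."""
    return [np.linalg.det(H[:k, :k]) for k in range(1, H.shape[0] + 1)]

H = np.array([[2.0, 1.0],
              [1.0, 3.0]])            # positive definite
print(leading_principal_minors(H))    # [2.0, 5.0]: all positive
```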

Bordered Hessian

A bordered Hessian is used for the second-derivative test in certain constrained optimization problems. Given the function f as before:

$$f(x_1, x_2, \dots, x_n),$$

but adding a constraint function g such that:

$$g(x_1, x_2, \dots, x_n) = c,$$

the bordered Hessian appears as

$$H(f, g) = \begin{bmatrix} 0 & \dfrac{\partial g}{\partial x_1} & \dfrac{\partial g}{\partial x_2} & \cdots & \dfrac{\partial g}{\partial x_n} \\[1ex] \dfrac{\partial g}{\partial x_1} & \dfrac{\partial^2 f}{\partial x_1^2} & \dfrac{\partial^2 f}{\partial x_1\,\partial x_2} & \cdots & \dfrac{\partial^2 f}{\partial x_1\,\partial x_n} \\[1ex] \dfrac{\partial g}{\partial x_2} & \dfrac{\partial^2 f}{\partial x_2\,\partial x_1} & \dfrac{\partial^2 f}{\partial x_2^2} & \cdots & \dfrac{\partial^2 f}{\partial x_2\,\partial x_n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \dfrac{\partial g}{\partial x_n} & \dfrac{\partial^2 f}{\partial x_n\,\partial x_1} & \dfrac{\partial^2 f}{\partial x_n\,\partial x_2} & \cdots & \dfrac{\partial^2 f}{\partial x_n^2} \end{bmatrix}.$$

If there are, say, m constraints, then the zero in the north-west corner is an m × m block of zeroes, and there are m border rows at the top and m border columns at the left.

The above rules stating that extrema are characterized by a positive definite or negative definite Hessian cannot apply here, since a bordered Hessian cannot be definite: we have $z^{\mathsf T} H z = 0$ whenever the vector z has a non-zero first element followed by zeroes.

The second derivative test here consists of sign restrictions on the determinants of a certain set of n − m submatrices of the bordered Hessian.[2] Intuitively, one can think of the m constraints as reducing the problem to one with n − m free variables. (For example, the maximization of $f(x_1, x_2, x_3)$ subject to the constraint $x_1 + x_2 + x_3 = 1$ can be reduced to the maximization of $f(x_1, x_2, 1 - x_1 - x_2)$ without constraint.)

Specifically,[3] sign conditions are imposed on the sequence of principal minors (determinants of upper-left-justified sub-matrices) of the bordered Hessian, the smallest minor consisting of the truncated first 2m+1 rows and columns, the next consisting of the truncated first 2m+2 rows and columns, and so on, with the last being the entire bordered Hessian. There are thus n − m minors to consider. A sufficient condition for a local maximum is that these minors alternate in sign with the smallest one having the sign of $(-1)^{m+1}$. A sufficient condition for a local minimum is that all of these minors have the sign of $(-1)^m$. (Note that in the unconstrained case of m = 0 these conditions coincide with the conditions for the unbordered Hessian to be negative definite or positive definite, respectively.)
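A minimal sketch of this procedure for a single constraint (m = 1). The helper names and the example, maximizing f(x, y) = xy subject to x + y = 2, are illustrative choices; with n = 2 and m = 1 there is exactly one minor to check, the full bordered determinant.

```python
import numpy as np

def bordered_hessian(hess_f, grad_g, x):
    """Bordered Hessian for one constraint g(x) = c: a zero corner,
    grad g as the border, and the Hessian of f in the interior."""
    gvec, H = grad_g(x), hess_f(x)
    n = gvec.size
    B = np.zeros((n + 1, n + 1))
    B[0, 1:] = gvec
    B[1:, 0] = gvec
    B[1:, 1:] = H
    return B

def test_minors(B, m=1):
    """Leading principal minors of sizes 2m+1 up to the full matrix."""
    return [np.linalg.det(B[:k, :k]) for k in range(2 * m + 1, B.shape[0] + 1)]

# Maximize f(x, y) = x*y subject to x + y = 2 (constrained maximum at (1, 1))
hess_f = lambda x: np.array([[0.0, 1.0], [1.0, 0.0]])
grad_g = lambda x: np.array([1.0, 1.0])
B = bordered_hessian(hess_f, grad_g, np.array([1.0, 1.0]))
print(test_minors(B))  # [2.0]: sign (+) matches (-1)^(m+1), so a maximum
```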

Vector-valued functions

If f is instead a vector-valued function, i.e.

$$f : \mathbb{R}^n \to \mathbb{R}^m,$$

then the array of second partial derivatives is not a two-dimensional matrix of size $n \times n$, but rather a tensor of order 3. This can be thought of as a multi-dimensional array with dimensions $m \times n \times n$, which degenerates to the usual Hessian matrix for $m = 1$.
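One way to picture this order-3 array is as a stack of ordinary Hessians, one per output component; a small SymPy sketch with an arbitrarily chosen map f : R^2 -> R^2:

```python
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
F = sp.Matrix([x1**2 * x2, sp.sin(x1 * x2)])   # f : R^2 -> R^2

# Shape (m, n, n): one n-by-n Hessian per component of f
tensor = [sp.hessian(F[k], (x1, x2)) for k in range(len(F))]
for H in tensor:
    sp.pprint(H)
```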

Generalizations to Riemannian manifolds

Let $(M, g)$ be a Riemannian manifold and $\nabla$ its Levi-Civita connection. Let $f : M \to \mathbb{R}$ be a smooth function. We may define the Hessian tensor

$$\operatorname{Hess}(f) \in \Gamma(T^*M \otimes T^*M)$$

by

$$\operatorname{Hess}(f) := \nabla \nabla f = \nabla \, df,$$

where we have taken advantage of the first covariant derivative of a function being the same as its ordinary derivative. Choosing local coordinates $\{x^i\}$ we obtain the local expression for the Hessian as

$$\operatorname{Hess}(f) = \left( \frac{\partial^2 f}{\partial x^i\,\partial x^j} - \Gamma^k_{ij}\,\frac{\partial f}{\partial x^k} \right) dx^i \otimes dx^j,$$

where $\Gamma^k_{ij}$ are the Christoffel symbols of the connection. Other equivalent forms for the Hessian are given by

$$\operatorname{Hess}(f)(X, Y) = \langle \nabla_X \operatorname{grad} f,\, Y \rangle \quad \text{and} \quad \operatorname{Hess}(f)(X, Y) = X(Yf) - df(\nabla_X Y).$$
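As a worked check of the local formula, the following sketch (an illustrative setup, not from the article) computes the Christoffel symbols of the round metric on the unit 2-sphere and the covariant Hessian of the height function f = cos(theta), which comes out as -f times the metric:

```python
import sympy as sp

theta, phi = sp.symbols("theta phi", positive=True)
coords = [theta, phi]
g = sp.diag(1, sp.sin(theta)**2)      # round metric on the unit 2-sphere
ginv = g.inv()

def christoffel(k, i, j):
    """Gamma^k_{ij} of the Levi-Civita connection of g."""
    return sp.Rational(1, 2) * sum(
        ginv[k, l] * (sp.diff(g[l, i], coords[j])
                      + sp.diff(g[l, j], coords[i])
                      - sp.diff(g[i, j], coords[l]))
        for l in range(2))

def hess(f):
    """Covariant Hessian: Hess(f)_ij = d_i d_j f - Gamma^k_{ij} d_k f."""
    return sp.Matrix(2, 2, lambda i, j: sp.simplify(
        sp.diff(f, coords[i], coords[j])
        - sum(christoffel(k, i, j) * sp.diff(f, coords[k]) for k in range(2))))

f = sp.cos(theta)                     # height function on the sphere
sp.pprint(hess(f))                    # equals -cos(theta) * g, i.e. -f * g
```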

    Notes

External links

Weisstein, Eric W., "Hessian" (http://mathworld.wolfram.com/Hessian.html) from MathWorld.
