The Hessian matrix collects all second-order partial derivatives of a scalar-valued function of multiple variables. For a function $f(x, y)$, the Hessian is

$$H(x, y) = \begin{pmatrix} f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \end{pmatrix} = \begin{pmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x\,\partial y} \\ \dfrac{\partial^2 f}{\partial y\,\partial x} & \dfrac{\partial^2 f}{\partial y^2} \end{pmatrix}.$$
This matrix reveals how curvature behaves near a specific point. When the Hessian is positive definite at a critical point, the function has a local minimum there. If it is negative definite, the point is a local maximum, and an indefinite Hessian indicates a saddle point. The Hessian thus plays a central role in optimization and analysis.
The calculator uses math.js to symbolically differentiate the user-provided function. After parsing $f(x, y)$, the script computes $\partial^2 f/\partial x^2$, $\partial^2 f/\partial x\,\partial y$, and $\partial^2 f/\partial y^2$. Because mixed partial derivatives commute for smooth functions, $f_{xy}$ equals $f_{yx}$. The algorithm evaluates these derivatives at the chosen point to fill in the matrix.
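To make that pipeline concrete, here is a minimal sketch of how the same idea can be wired up with math.js; the hessianAt helper and its arguments are illustrative names, not the calculator's actual source.

```js
// Sketch: build the 2x2 Hessian of f(x, y) symbolically with math.js.
// Assumes Node.js with the "mathjs" package installed; names are illustrative.
const math = require('mathjs');

function hessianAt(expression, x0, y0) {
  const f = math.parse(expression);        // parse the user-provided formula
  const fx = math.derivative(f, 'x');      // first partials
  const fy = math.derivative(f, 'y');
  const fxx = math.derivative(fx, 'x');    // second partials
  const fxy = math.derivative(fx, 'y');    // equals f_yx for smooth f
  const fyy = math.derivative(fy, 'y');
  const scope = { x: x0, y: y0 };
  return [
    [fxx.evaluate(scope), fxy.evaluate(scope)],
    [fxy.evaluate(scope), fyy.evaluate(scope)],
  ];
}

console.log(hessianAt('x^2 * y + sin(y)', 1, 0)); // [[0, 2], [2, 0]]
```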
The parser accepts standard mathematical syntax, including exponents and common functions such as sin, cos, and exp. If the expression fails to parse, the calculator presents a helpful message. For well-behaved input, the derivatives are exact, providing a reliable alternative to numerical differentiation, which may suffer from rounding errors or step-size issues.
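A parse failure can be caught and turned into a readable message rather than a crash; the snippet below is a sketch of one way to do that, where parseOrExplain is a hypothetical helper and not the page's real code.

```js
// Sketch: report a readable message instead of throwing on bad input.
const math = require('mathjs');

function parseOrExplain(expression) {
  try {
    return { ok: true, node: math.parse(expression) };
  } catch (err) {
    return { ok: false, message: `Could not parse "${expression}": ${err.message}` };
  }
}

console.log(parseOrExplain('sin(x'));                // ok: false, with an explanation
console.log(parseOrExplain('sin(x) + exp(y)').ok);   // true
```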
Second derivatives capture curvature, which governs how gradients change in multidimensional space. In machine learning, the Hessian influences optimization algorithms like Newton's method. In physics, it describes stability of equilibrium solutions. In differential geometry, the Hessian defines the second fundamental form, connecting calculus to surface curvature. By easily computing this matrix, you can explore how functions behave beyond simple slopes.
The Hessian is most powerful when combined with the second-derivative test for functions of two variables. After computing $H$, evaluate its determinant $D = f_{xx} f_{yy} - f_{xy}^2$. If $D > 0$ and the top-left entry $f_{xx}$ is positive, the point is a local minimum. A positive determinant with a negative $f_{xx}$ signals a local maximum. When the determinant is negative, the surface curves in opposing directions, revealing a saddle point. A zero determinant leaves the test inconclusive, and higher-order derivatives or other techniques are needed. This calculator now performs that classification automatically to help interpret the raw second derivatives.
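A compact sketch of that classification logic, assuming the three distinct Hessian entries have already been evaluated, might look like the following; classifyCriticalPoint is an illustrative name, not the calculator's internal function.

```js
// Sketch: second-derivative test for a 2x2 Hessian [[fxx, fxy], [fxy, fyy]].
function classifyCriticalPoint(fxx, fxy, fyy) {
  const det = fxx * fyy - fxy * fxy;   // D = fxx*fyy - fxy^2
  if (det > 0 && fxx > 0) return 'local minimum';
  if (det > 0 && fxx < 0) return 'local maximum';
  if (det < 0) return 'saddle point';
  return 'inconclusive (determinant is zero)';
}

console.log(classifyCriticalPoint(2, 0, 2));   // 'local minimum'
console.log(classifyCriticalPoint(2, 3, 2));   // 'saddle point' (det = -5)
```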
Consider the quadratic bowl $f(x, y) = x^2 + y^2$. Enter this expression with the point $(0, 0)$ into the calculator. The second partial derivatives are $f_{xx} = 2$, $f_{yy} = 2$, and $f_{xy} = 0$. The resulting matrix has determinant $2 \cdot 2 - 0^2 = 4$, which is positive. Because $f_{xx} = 2$ is also positive, the calculator concludes the origin is a strict local minimum. Trying the same function at a point such as $(1, 1)$ produces identical second derivatives; the classification remains a minimum even though the function value is higher. This example demonstrates that curvature, not absolute height, determines local optimality.
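Running the hypothetical helpers from the earlier sketches on this bowl reproduces those numbers:

```js
// Sketch: verify the worked example for f(x, y) = x^2 + y^2.
const H0 = hessianAt('x^2 + y^2', 0, 0);   // [[2, 0], [0, 2]]
const H1 = hessianAt('x^2 + y^2', 1, 1);   // identical: the curvature is constant
console.log(H0, H1);
console.log(classifyCriticalPoint(H0[0][0], H0[0][1], H0[1][1])); // 'local minimum'
```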
Optimization algorithms, from Newton’s method to quasi-Newton schemes like BFGS, rely on Hessian information to shape their search steps. A well-conditioned Hessian leads to rapid convergence, whereas a poorly conditioned matrix may stall progress. In statistics, the negative inverse of the Hessian approximates the covariance matrix of maximum-likelihood estimators, offering insight into parameter uncertainty. Economists analyze Hessians to classify utility and production functions, while physicists use them to assess the stability of equilibrium points in mechanics and field theory. Because this calculator exposes the numeric Hessian at arbitrary points, it doubles as a teaching aid and a quick diagnostic tool for real-world models.
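As one concrete illustration, a single Newton step moves the current point by the Hessian inverse applied to the gradient. The sketch below assumes math.js and uses illustrative names; a production implementation would add damping and other safeguards.

```js
// Sketch: one Newton step (x, y) <- (x, y) - H^{-1} * grad f, built on math.js.
const math = require('mathjs');

function newtonStep(expression, x0, y0) {
  const f = math.parse(expression);
  const fx = math.derivative(f, 'x');
  const fy = math.derivative(f, 'y');
  const scope = { x: x0, y: y0 };
  const grad = [fx.evaluate(scope), fy.evaluate(scope)];
  const H = [
    [math.derivative(fx, 'x').evaluate(scope), math.derivative(fx, 'y').evaluate(scope)],
    [math.derivative(fy, 'x').evaluate(scope), math.derivative(fy, 'y').evaluate(scope)],
  ];
  const step = math.multiply(math.inv(H), grad);   // step = H^{-1} * grad
  return [x0 - step[0], y0 - step[1]];
}

// For a quadratic like x^2 + y^2, one step from anywhere lands on the minimum.
console.log(newtonStep('x^2 + y^2', 3, -4));       // [0, 0]
```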
Another perspective on the Hessian comes from its eigenvalues. These numbers describe how the surface curves along principal directions. When both eigenvalues are positive, the surface bends upward in every direction, hence a minimum; when both are negative, it bends downward, hence a maximum. Opposite signs correspond to saddle points with opposing curvature. Although this tool reports the determinant and classification rather than the full eigenvalue spectrum, you can compute the eigenvalues of the 2x2 matrix easily by hand or with a linear algebra package. Understanding these eigenvalues helps anticipate how gradient descent might behave; large-magnitude eigenvalues signal steep valleys or ridges that may necessitate step-size adjustments.
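For a symmetric 2x2 matrix the eigenvalues follow directly from the trace and determinant via the quadratic formula, so a quick hand check needs only a few lines; the helper name below is illustrative.

```js
// Sketch: eigenvalues of a symmetric 2x2 matrix [[a, b], [b, c]]
// via lambda = (trace ± sqrt(trace^2 - 4*det)) / 2.
function symmetricEigenvalues(a, b, c) {
  const trace = a + c;
  const det = a * c - b * b;
  const disc = Math.sqrt(trace * trace - 4 * det);  // non-negative for symmetric input
  return [(trace - disc) / 2, (trace + disc) / 2];
}

console.log(symmetricEigenvalues(2, 0, 2));   // [2, 2]  -> minimum
console.log(symmetricEigenvalues(2, 3, 2));   // [-1, 5] -> saddle
```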
Symbolic differentiation is powerful yet sensitive. Functions with absolute values, discontinuities, or non-analytic behavior may produce undefined second derivatives. Even smooth functions can generate huge intermediate expressions that slow down computation. If you encounter an error, try simplifying your expression or breaking it into smaller parts. Keep in mind that the second-derivative test assumes you are evaluating the Hessian at a critical point where the gradient is zero. The calculator does not verify this condition; it simply reports curvature. Always examine first derivatives separately when searching for minima or maxima. For functions of more than two variables, the Hessian becomes larger, and definiteness requires analyzing leading principal minors or eigenvalues—ideas that extend naturally from the 2x2 case showcased here.
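Since the test only applies where the gradient vanishes, one simple guard is to evaluate the first derivatives at the chosen point and flag anything far from zero. The sketch below assumes math.js; the helper name and tolerance are arbitrary choices for illustration.

```js
// Sketch: warn when the evaluation point is not (numerically) a critical point.
const math = require('mathjs');

function checkCriticalPoint(expression, x0, y0, tol = 1e-8) {
  const f = math.parse(expression);
  const scope = { x: x0, y: y0 };
  const gx = math.derivative(f, 'x').evaluate(scope);
  const gy = math.derivative(f, 'y').evaluate(scope);
  const isCritical = Math.abs(gx) < tol && Math.abs(gy) < tol;
  return { gradient: [gx, gy], isCritical };
}

console.log(checkCriticalPoint('x^2 + y^2', 0, 0).isCritical);  // true
console.log(checkCriticalPoint('x^2 + y^2', 1, 1));             // gradient [2, 2], not critical
```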
Enter a function involving x and y, specify the evaluation point, and press Compute Hessian. The calculator now outputs the full Hessian matrix, its determinant, and a classification of the point based on the second-derivative test. The entries appear as numerical values with four decimal digits. Try varying the point to see how the matrix changes across the surface defined by your function. Experimentation will build intuition for convexity, saddle points, and other features.