This calculator helps you solve constrained optimization problems of the form
maximize / minimize f(x, y) subject to g(x, y) = 0.
It is aimed at students working on calculus homework, as well as practitioners in economics, engineering, and physics who need a quick numerical check. The tool uses the method of Lagrange multipliers to locate stationary points that satisfy both the objective and the constraint.
After you enter your functions and (optionally) initial guesses, the solver attempts to
find values (x*, y*) and a multiplier λ such that the necessary
conditions for a constrained extremum are satisfied.
Suppose you want to optimize a differentiable function f(x, y)
subject to a differentiable constraint g(x, y) = 0. The method introduces an
auxiliary variable λ and forms the Lagrangian
L(x, y, λ) = f(x, y) + λ g(x, y).
At a constrained extremum, the gradients of f and g are parallel.
This leads to the system of equations

∂f/∂x + λ ∂g/∂x = 0
∂f/∂y + λ ∂g/∂y = 0

together with the original constraint g(x, y) = 0. Equivalently, we set the partial derivatives of the Lagrangian to zero:

∂L/∂x = 0
∂L/∂y = 0
∂L/∂λ = g(x, y) = 0

Solving this system gives candidate points that may be maxima, minima, or saddle points along the constraint curve.
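The first-order conditions above are exactly what a numerical root-finder works on. Here is an illustrative sketch (not the calculator's actual implementation) of Newton's method applied to this system, for the assumed example f(x, y) = x^2 + y^2 and g(x, y) = x + y - 1; all helper names are made up for the sketch:

```python
# Solve the first-order conditions  fx + λ*gx = 0,  fy + λ*gy = 0,  g = 0
# with Newton's method, using a finite-difference Jacobian.
# Assumed example: f(x, y) = x^2 + y^2, g(x, y) = x + y - 1.

def F(v):
    x, y, lam = v
    return [2*x + lam,   # ∂L/∂x = fx + λ*gx
            2*y + lam,   # ∂L/∂y = fy + λ*gy
            x + y - 1]   # ∂L/∂λ = g

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, n):
            k = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= k * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def newton(F, v, steps=25, h=1e-7):
    for _ in range(steps):
        Fv = F(v)
        # Build the Jacobian column by column with forward differences
        cols = []
        for i in range(3):
            vp = list(v)
            vp[i] += h
            Fp = F(vp)
            cols.append([(Fp[r] - Fv[r]) / h for r in range(3)])
        J = [[cols[c][r] for c in range(3)] for r in range(3)]  # rows = equations
        d = solve3(J, [-val for val in Fv])
        v = [v[i] + d[i] for i in range(3)]
    return v

x, y, lam = newton(F, [0.0, 0.0, 0.0])
print(round(x, 6), round(y, 6), round(lam, 6))  # ≈ 0.5 0.5 -1.0
```

Because this particular system happens to be linear, Newton's method lands on (0.5, 0.5) with λ = -1 almost immediately; for nonlinear f and g the same loop iterates until the residuals are small.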
Fill in the fields using the variables x and y:

- Objective function f(x, y), for example: x*y^2, x^2 + y^2, sin(x) + y^2.
- Constraint function g(x, y), for example: x + y - 3, x^2 + y^2 - 1.
Acceptable syntax follows standard programming-style notation:
- * for multiplication (e.g., x*y).
- ^ for powers (e.g., y^2, x^3).
- Parentheses for grouping (e.g., (x - 1)^2 + (y + 2)^2).

After entering your expressions, click the Solve button. The tool uses symbolic differentiation and a numerical root-finder to solve the first-order conditions.
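As a rough sketch of how such input could be handled (not the tool's actual parser), calculator-style syntax can be turned into an evaluable function by rewriting ^ as ** and exposing a small whitelist of math functions; the helper name parse_expr and the supported function list are assumptions for illustration:

```python
import math

def parse_expr(text):
    """Turn calculator-style syntax (^ for powers) into a function of x and y.

    Illustrative only: uses eval with a restricted namespace,
    so only feed it trusted input.
    """
    py = text.replace("^", "**")
    allowed = {"sin": math.sin, "cos": math.cos, "exp": math.exp,
               "sqrt": math.sqrt, "pi": math.pi}
    return lambda x, y: eval(py, {"__builtins__": {}},
                             {**allowed, "x": x, "y": y})

f = parse_expr("x*y^2")
g = parse_expr("x + y - 3")
print(f(2, 1), g(2, 1))  # -> 2 0
```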
Consider maximizing or minimizing
f(x, y) = x*y^2 subject to the linear constraint
g(x, y) = x + y - 3 = 0.
Enter x*y^2 as the objective and x + y - 3 as the constraint, with Initial x = 1 and
Initial y = 2, which already satisfy
x + y ≈ 3.
L(x, y, λ) = x*y^2 + λ (x + y - 3).
The partial derivatives are

∂L/∂x = y^2 + λ
∂L/∂y = 2*x*y + λ
∂L/∂λ = x + y - 3

Setting them to zero gives the system

y^2 + λ = 0
2*x*y + λ = 0
x + y - 3 = 0

Every solution must lie on the line x + y = 3. Subtracting the first equation from the second gives y*(2*x - y) = 0, so either y = 0 (the point (3, 0) with λ = 0) or y = 2*x. The latter gives
(x*, y*) = (1, 2) with λ = -4. At this point,
f(1, 2) = 1 * 2^2 = 4.
By evaluating f at other candidate points on the same constraint, you
can see whether f(1, 2) is larger or smaller. Comparing values along the
constraint allows you to classify each stationary point as a constrained maximum,
constrained minimum, or saddle point.
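Any candidate can be sanity-checked by substituting it into the three first-order conditions. A minimal check for this example, using the stationary point (x*, y*) = (1, 2) with λ = -4:

```python
x, y, lam = 1.0, 2.0, -4.0

dL_dx = y**2 + lam     # y^2 + λ
dL_dy = 2*x*y + lam    # 2xy + λ
dL_dlam = x + y - 3    # the constraint g(x, y)

print(dL_dx, dL_dy, dL_dlam)  # all three vanish
print(x * y**2)               # f(1, 2) = 4.0
```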
In practice, the calculator performs the steps above (forming the Lagrangian, differentiating, and solving the system) internally. You only need to provide the functions and, optionally, the initial guesses.
A successful run typically returns:
- The stationary point (x*, y*).
- The multiplier λ, which indicates how sensitive f is to small changes in the constraint.
- The value of f at the stationary point.

To decide whether each point is a maximum or a minimum along the constraint, you can:

- Compare f(x*, y*) across multiple solutions returned by the solver.
- Evaluate f at nearby feasible points on the constraint and compare the f-values.
Remember that the method finds local stationary points. The largest
f(x*, y*) among all feasible candidates is the constrained maximum, and the
smallest is the constrained minimum on the specified constraint.
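For the running example, comparing f at the two stationary points and sampling the constraint line nearby shows this classification in action (a sketch; the sample offsets are arbitrary):

```python
def f(x, y):
    return x * y**2

# The two stationary points found for f = x*y^2, g = x + y - 3:
for x, y, lam in [(1.0, 2.0, -4.0), (3.0, 0.0, 0.0)]:
    print((x, y), "f =", f(x, y))

# Sample along the constraint y = 3 - x on either side of x = 1:
for x in (0.9, 1.0, 1.1):
    print(x, round(f(x, 3 - x), 4))
# f dips on both sides of x = 1, so (1, 2) is a local (not global)
# maximum along the line: f grows without bound as x increases.
```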
| Aspect | Manual Lagrange multiplier method | Using this calculator |
|---|---|---|
| Derivatives | You compute ∂L/∂x, ∂L/∂y, ∂L/∂λ by hand. | Symbolic differentiation is done automatically. |
| Solving equations | You solve the nonlinear system yourself (often algebra-heavy). | A numerical solver searches for roots of the first-order conditions. |
| Speed | Can be slow and error-prone, especially with messy functions. | Very fast once expressions are entered correctly. |
| Transparency | Every algebraic step is visible and instructive. | Best for checking results or exploring, not for showing detailed steps. |
| Multiple solutions | You must systematically explore all possible cases. | Try different initial guesses to discover additional stationary points. |
| Scope | Extends in principle to higher dimensions and more constraints. | This tool focuses on two variables and one equality constraint. |
To use this calculator effectively, keep these points in mind:
- Both f(x, y) and g(x, y)
should be differentiable in the region of interest so that gradients exist.
- The constraint must be a single equality g(x, y) = 0 in two variables.
- The method assumes the gradient ∇g(x, y) is nonzero at the solution. If the constraint gradient vanishes
there, the method may fail or return misleading results.
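To see why the nonzero-gradient assumption matters, consider the classic textbook failure case: minimize f(x, y) = x subject to g(x, y) = y^2 - x^3 = 0. The minimum is at the origin, but ∇g vanishes there, so no multiplier can balance ∇f:

```python
x, y = 0.0, 0.0              # constrained minimizer of f(x, y) = x on y^2 = x^3
grad_f = (1.0, 0.0)          # ∇f = (1, 0) everywhere
grad_g = (-3 * x**2, 2 * y)  # ∇g = (-3x^2, 2y); both components vanish here

print(grad_f, grad_g)
# Stationarity ∇f + λ∇g = 0 would require 1 + λ*0 == 0,
# which no λ satisfies; this is why a solver can fail at such points.
```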
For deeper study, you may also want to review related topics such as gradients, unconstrained optimization, and second-order conditions for maxima and minima.