Logistic regression models binary outcomes with the logistic function $\sigma(x) = \frac{1}{1 + e^{-(wx + b)}}$. The parameters $w$ and $b$ control the slope and offset. By adjusting them, the sigmoid curve can represent probabilities between 0 and 1 for any input value $x$.
Given a set of sample points $(x_i, y_i)$ with each $y_i$ either 0 or 1, we determine $w$ and $b$ by minimizing the negative log-likelihood

$$L(w, b) = -\sum_i \left[\, y_i \log p_i + (1 - y_i) \log(1 - p_i) \,\right], \qquad p_i = \sigma(x_i).$$
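As a minimal sketch of the model and its loss (the function names here are illustrative, not the calculator's actual code):

```python
import math

def sigmoid(x, w, b):
    """Logistic function sigma(wx + b); maps any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def neg_log_likelihood(points, w, b):
    """Negative log-likelihood of binary labels under the logistic model.

    `points` is a list of (x, y) pairs with y in {0, 1}.
    """
    total = 0.0
    for x, y in points:
        p = sigmoid(x, w, b)
        total -= y * math.log(p) + (1 - y) * math.log(1 - p)
    return total
```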
The gradients of $L$ with respect to $w$ and $b$ yield simple update rules: $\partial L/\partial w = \sum_i (p_i - y_i)\, x_i$ and $\partial L/\partial b = \sum_i (p_i - y_i)$. Starting with zero parameters, gradient descent iteratively subtracts a fraction of these gradients until convergence. This approach works well for small data sets and illustrates how logistic regression learns to separate classes.
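A batch gradient-descent loop built on those two gradients might look like the following sketch, reusing the `sigmoid` helper above (the `fit` signature and defaults are assumptions, not the tool's real implementation):

```python
def fit(points, lr=0.1, iters=1000):
    """Fit w and b by batch gradient descent on the negative log-likelihood."""
    w, b = 0.0, 0.0                      # start from zero parameters
    for _ in range(iters):
        grad_w, grad_b = 0.0, 0.0
        for x, y in points:
            err = sigmoid(x, w, b) - y   # p_i - y_i
            grad_w += err * x            # dL/dw accumulates (p_i - y_i) * x_i
            grad_b += err                # dL/db accumulates (p_i - y_i)
        w -= lr * grad_w                 # step against the gradient
        b -= lr * grad_b
    return w, b
```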
This calculator implements a basic gradient descent optimizer. Input your training points one per line, set a learning rate and iteration count, and click "Fit Model". The algorithm repeatedly computes predicted probabilities, accumulates the gradients, and updates $w$ and $b$. After training, the resulting parameters are displayed along with the probability estimate for each input point.
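Mirroring that workflow on a small made-up training set (the points and settings below are hypothetical, chosen only to show the shape of the output):

```python
# Hypothetical data: x below ~2 labeled 0, above labeled 1.
points = [(0.5, 0), (1.0, 0), (1.5, 0), (2.5, 1), (3.0, 1), (3.5, 1)]
w, b = fit(points, lr=0.5, iters=2000)
print(f"w = {w:.3f}, b = {b:.3f}")
for x, _ in points:
    print(f"x = {x}: P(y=1) = {sigmoid(x, w, b):.3f}")
```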
While simple, logistic regression remains a cornerstone of statistical modeling. It forms the basis for classification tasks from medical testing to marketing. Understanding its mechanics helps demystify more complex machine-learning algorithms that extend or generalize it. By experimenting with this tool, you can see firsthand how varying the learning rate or number of iterations affects convergence and final accuracy.