Suppose we draw $n$ independent samples from a normal distribution with unknown mean $\mu$ and variance $\sigma^2$. The Fisher information quantifies how much the data tell us about these parameters. More formally, it measures the expected curvature of the log-likelihood function at the true parameter values. Steep curvature means small changes in the parameter drastically affect the likelihood, indicating greater information.
For the normal distribution the log-likelihood is quadratic in the mean. Differentiating twice and taking expectations yields a symmetric matrix with entries that depend on $\mu$ and $\sigma^2$. This matrix is called the Fisher information matrix, often written $I(\mu, \sigma^2)$.
When both mean and variance are unknown, the Fisher information matrix for a single observation is
\[
I(\mu, \sigma^2) = \begin{pmatrix} \dfrac{1}{\sigma^2} & 0 \\[6pt] 0 & \dfrac{1}{2\sigma^4} \end{pmatrix}.
\]
Scaling this by the sample size gives the total information $n\,I(\mu, \sigma^2)$ for $n$ independent draws. The off-diagonal entries vanish because the mean and variance are orthogonal parameters in the normal family.
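The diagonal form can be verified numerically. The sketch below (a minimal check, assuming NumPy; the values $\mu = 1$ and $\sigma^2 = 2$ are arbitrary illustrative choices) compares the Monte Carlo average of the negative Hessian of the log-density with the closed-form matrix above.

```python
import numpy as np

def fisher_info_normal(mu, sigma2, n=1):
    """Closed-form Fisher information matrix for n i.i.d. N(mu, sigma2) draws."""
    return n * np.array([[1.0 / sigma2, 0.0],
                         [0.0, 1.0 / (2.0 * sigma2**2)]])

# Monte Carlo check: E[-Hessian of the log-density] should match the closed form.
rng = np.random.default_rng(0)
mu, sigma2 = 1.0, 2.0          # illustrative values, not from the text
x = rng.normal(mu, np.sqrt(sigma2), size=1_000_000)

# Second partial derivatives of log f(x; mu, sigma2) for one observation:
#   d2/dmu2          = -1/sigma2
#   d2/(dmu dsigma2) = -(x - mu)/sigma2**2
#   d2/d(sigma2)^2   = 1/(2*sigma2**2) - (x - mu)**2 / sigma2**3
h11 = np.full_like(x, -1.0 / sigma2)
h12 = -(x - mu) / sigma2**2
h22 = 1.0 / (2.0 * sigma2**2) - (x - mu)**2 / sigma2**3

empirical = -np.array([[h11.mean(), h12.mean()],
                       [h12.mean(), h22.mean()]])
```

With a million draws, `empirical` agrees with `fisher_info_normal(mu, sigma2)` to a few decimal places, and the off-diagonal averages hover near zero, matching the orthogonality of the two parameters.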
Large Fisher information indicates that an unbiased estimator of the parameter can have small variance: by the Cramér–Rao bound, the variance of any unbiased estimator is at least the inverse of the information. Conversely, small information implies that the data contain limited guidance. For the normal distribution, increasing the sample size or decreasing the variance raises the information in a straightforward manner.
Because the matrix is diagonal, its inverse is simple to compute. The variance lower bounds for unbiased estimators of the mean and variance are $\sigma^2/n$ and $2\sigma^4/n$, respectively.
For a concrete sample size $n$ and variance $\sigma^2$, plugging these values into the formulas yields a Fisher information matrix with diagonal entries $n/\sigma^2$ and $n/(2\sigma^4)$. This illustrates how directly the information scales with sample size.
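The mean bound $\sigma^2/n$ is attained by the sample mean, which can be seen by simulation. A short sketch (assuming NumPy; the values $\sigma^2 = 2$ and $n = 50$ are illustrative assumptions, not from the text):

```python
import numpy as np

# Illustrative check: the sample mean is unbiased for mu, and its variance
# should equal the Cramer-Rao bound sigma2 / n for the normal family.
rng = np.random.default_rng(1)
mu, sigma2, n, reps = 0.0, 2.0, 50, 200_000   # assumed example values

samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
var_of_mean = samples.mean(axis=1).var()       # empirical Var(sample mean)

crb = sigma2 / n   # inverse of the (mu, mu) entry of n * I(mu, sigma2)
```

Here `var_of_mean` comes out very close to `crb`, confirming that the sample mean is an efficient estimator of the normal mean.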
Fisher information plays a central role in statistics and information theory. It connects to maximum likelihood estimation, asymptotic normality, and Bayesian priors. In many models, high information corresponds to narrower posterior distributions, meaning the data strongly constrain the parameter values.
Understanding Fisher information helps in experiment design, where one aims to maximize expected information by choosing appropriate sample sizes and measurement precision. It also appears in physics in the study of quantum measurement and thermodynamic uncertainty relations.