A Markov chain describes a system that moves among a finite set of states, where the probability of transitioning to the next state depends only on the current state. The collection of probabilities for moving from state to state forms a transition matrix P. Each row of P sums to one, reflecting the total probability of leaving a given state. Examples abound in queuing theory, genetics, finance, and computer science. Understanding a chain's long-term behavior often centers on finding a stationary distribution, a set of probabilities that remains unchanged after repeated transitions.
If π is a row vector of probabilities summing to one, it is stationary when πP = π. In practice, this means that starting with distribution π and applying the transition matrix P leaves the distribution unchanged. Many Markov chains converge to such a distribution regardless of the initial state, a property called ergodicity. The stationary distribution reveals the proportion of time the chain spends in each state over the long run.
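The defining condition πP = π can be checked numerically. A minimal sketch in JavaScript, using an illustrative 2×2 matrix (not taken from the calculator itself):

```javascript
// Multiply a row vector pi by a transition matrix P (nested arrays):
// the result is the distribution after one step of the chain.
function step(pi, P) {
  return P[0].map((_, j) =>
    pi.reduce((sum, p, i) => sum + p * P[i][j], 0)
  );
}

// Illustrative two-state chain and a candidate stationary distribution.
const P = [[0.9, 0.1], [0.5, 0.5]];
const pi = [5 / 6, 1 / 6];

const next = step(pi, P);
console.log(next); // numerically equal to pi, so pi is stationary
```

Because `next` matches `pi` up to floating-point error, this π satisfies the stationarity condition for this particular P.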
One way to find the stationary distribution is to raise the transition matrix to higher and higher powers, then multiply by an arbitrary starting distribution. For regular chains this process converges to the stationary distribution. The calculator uses this iterative approach for simplicity and stability. Starting with a uniform distribution π₀, it repeatedly computes πₖ₊₁ = πₖP. When the difference between successive distributions becomes smaller than a tolerance, the process stops and the result is displayed.
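Matrix powers make the convergence visible: for a regular chain, every row of Pⁿ approaches the stationary distribution. A short sketch, using an assumed 2×2 example matrix:

```javascript
// Multiply two square matrices given as nested arrays.
function matmul(A, B) {
  return A.map(row =>
    B[0].map((_, j) => row.reduce((s, a, k) => s + a * B[k][j], 0))
  );
}

const P = [[0.9, 0.1], [0.5, 0.5]];
let Pn = P;
for (let k = 0; k < 6; k++) Pn = matmul(Pn, Pn); // repeated squaring: P^64

console.log(Pn[0]); // ≈ [0.8333, 0.1667]
console.log(Pn[1]); // both rows agree: each row is the stationary distribution
```

Multiplying any starting distribution by Pⁿ then yields (approximately) the same result, which is why the choice of initial distribution does not matter for regular chains.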
Stationary distributions are more than just mathematical curiosities. In web page ranking, for instance, the PageRank algorithm models user navigation as a Markov chain and computes the steady state to determine page importance. In genetics, Markov chains predict allele frequencies under random mating. In finance, they model credit ratings or market regimes. Because the stationary distribution captures long-term tendencies, it provides insight into equilibrium behavior across many disciplines.
Fill in the transition probabilities. If you leave the third row and column blank, the calculator assumes a 2×2 chain. Be sure each row sums to one; otherwise the model does not represent a valid Markov process. After clicking the button, the script normalizes the rows, initializes a uniform distribution, and iterates up to 1000 steps or until the change is tiny. The final probabilities appear rounded to four decimal places, giving you the stationary distribution.
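The row-normalization step described above can be sketched as follows; the function and variable names are illustrative, not the calculator's actual code:

```javascript
// Scale each row of a matrix so it sums to one.
// Rows summing to zero are left untouched to avoid division by zero.
function normalizeRows(P) {
  return P.map(row => {
    const total = row.reduce((s, x) => s + x, 0);
    return total > 0 ? row.map(x => x / total) : row;
  });
}

const raw = [[3, 1], [1, 1]];    // entries need not sum to one yet
console.log(normalizeRows(raw)); // [[0.75, 0.25], [0.5, 0.5]]
```

Normalizing first makes the input forgiving: you can enter raw counts or slightly off-by-rounding probabilities and still get a valid stochastic matrix.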
Consider, for example, a two-state chain in which the system stays in state 1 with probability 0.9 and moves from state 2 to state 1 with probability 0.5, so P = [[0.9, 0.1], [0.5, 0.5]]. Starting from either state, the system eventually spends 5/6 of the time in state 1 and 1/6 in state 2. The calculator reproduces this distribution by iterating the uniform vector under repeated multiplication.
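Two-state chains also have a closed form worth knowing: for P = [[1 − a, a], [b, 1 − b]], the balance equation a·π₁ = b·π₂ gives π = (b/(a+b), a/(a+b)). A quick check with the illustrative values a = 0.1, b = 0.5:

```javascript
// Closed-form stationary distribution of a two-state chain
// P = [[1 - a, a], [b, 1 - b]], from the balance equation a*pi1 = b*pi2.
function twoStateStationary(a, b) {
  return [b / (a + b), a / (a + b)];
}

const pi = twoStateStationary(0.1, 0.5);
console.log(pi); // ≈ [0.8333, 0.1667], i.e. [5/6, 1/6]
```

This gives a handy way to verify the calculator's output on any 2×2 input.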
The code treats the matrix as a nested array. At each iteration it computes πₖ₊₁ = πₖP by a simple double loop. It then checks the sum of absolute differences from the previous distribution. When that sum falls below a small tolerance, convergence is declared. This procedure works reliably for small matrices that are irreducible and aperiodic. While more sophisticated methods exist for large chains, the iterative approach provides an intuitive glimpse of how Markov processes settle into equilibrium.
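The whole procedure can be sketched as one function. This is a reconstruction of the approach just described, not the calculator's actual source, and the default tolerance is an assumption:

```javascript
// Power iteration for the stationary distribution of a transition matrix.
// Starts from the uniform distribution and stops when the L1 distance
// between successive distributions falls below the tolerance.
function stationary(P, maxIter = 1000, tol = 1e-10) {
  const n = P.length;
  let pi = new Array(n).fill(1 / n);   // uniform starting distribution
  for (let iter = 0; iter < maxIter; iter++) {
    const next = new Array(n).fill(0);
    for (let i = 0; i < n; i++)        // double loop: next = pi * P
      for (let j = 0; j < n; j++)
        next[j] += pi[i] * P[i][j];
    const diff = next.reduce((s, x, j) => s + Math.abs(x - pi[j]), 0);
    pi = next;
    if (diff < tol) break;             // convergence declared
  }
  return pi;
}

const result = stationary([[0.9, 0.1], [0.5, 0.5]]);
console.log(result); // ≈ [0.8333, 0.1667]
```

For an irreducible, aperiodic chain this loop converges geometrically, so 1000 iterations is far more than enough for small matrices.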
Named after the Russian mathematician Andrey Markov, these chains were initially studied in the early twentieth century as models of dependent events. Over time they have become fundamental tools in probability theory and statistical mechanics. The concept of a stationary distribution lies at the heart of the ergodic theorem, which roughly states that time averages equal ensemble averages for many random processes. By experimenting with this calculator, you engage with a century-old mathematical tradition that continues to inform modern data science and physics.
When constructing your own transition matrices, verify that the entries are non-negative and that each row sums to one. Chains that violate these conditions can yield misleading results. If a chain is not ergodic—for example, if it splits into separate communicating classes—the iterative method might converge to a distribution that depends on the initial state. In such cases, analyzing the chain's structure or computing eigenvectors of P may be necessary. For many small, well-behaved chains, however, this calculator offers a quick way to see long-run trends.
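A validity check of the kind described, non-negative entries and unit row sums, might look like this (an illustrative sketch, with the tolerance chosen as an assumption):

```javascript
// Returns true if P is a valid row-stochastic matrix:
// all entries non-negative and every row summing to one (within tol).
function isStochastic(P, tol = 1e-9) {
  return P.every(row => {
    if (row.some(x => x < 0)) return false;
    const sum = row.reduce((s, x) => s + x, 0);
    return Math.abs(sum - 1) < tol;
  });
}

console.log(isStochastic([[0.9, 0.1], [0.5, 0.5]])); // true
console.log(isStochastic([[0.9, 0.2], [0.5, 0.5]])); // false: row sums to 1.1
```

Running such a check before iterating catches invalid inputs early, rather than letting them silently produce a meaningless "stationary" vector.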