PageRank assigns a numerical weight to each node in a directed graph based on the pattern of incoming links. Originally developed by Larry Page and Sergey Brin to rank web pages, the algorithm imagines a random surfer who continually follows hyperlinks. Pages that attract many inbound links or links from influential pages receive higher scores because the surfer is more likely to land on them. In mathematical language, PageRank is the stationary distribution of a Markov chain defined on the network. Each page corresponds to a state, and transitions occur by following links. The damping factor d represents the probability that the surfer continues following links at each step; with probability 1−d the surfer teleports to a random page, ensuring that the chain is ergodic and has a unique stationary distribution.
Suppose there are n pages. Let M be the n-by-n matrix whose entry M_ij denotes the probability of moving from page j to page i. This matrix is column stochastic: each column sums to one because it represents a probability distribution over destinations when starting from a given page. If a page has no outgoing links, it is treated as linking equally to all pages. The PageRank vector r satisfies the linear system

r = d·M·r + ((1 − d)/n)·1
where 1 is the vector of all ones. Intuitively, the new rank of a page equals the damped sum of ranks from linking pages plus a share of the teleportation probability. Solving this equation directly would involve finding the eigenvector associated with eigenvalue one, but for small graphs an iterative method called power iteration is straightforward. Start with an initial rank vector, often uniform, and repeatedly apply the transformation until convergence.
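The power iteration just described can be sketched in a few lines of Python (illustrative only; this is not the calculator's own script, and the function name `pagerank` is my own):

```python
def pagerank(M, d=0.85, tol=1e-10, max_iter=1000):
    """Power iteration: r <- d*M*r + (1-d)/n, with M column stochastic."""
    n = len(M)
    r = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(max_iter):
        new = [d * sum(M[i][j] * r[j] for j in range(n)) + (1 - d) / n
               for i in range(n)]
        if max(abs(new[i] - r[i]) for i in range(n)) < tol:
            return new
        r = new
    return r
```

Because M is column stochastic, every iterate already sums to one, so no renormalization is needed between steps. For two pages that link only to each other, `pagerank([[0.0, 1.0], [1.0, 0.0]])` converges to equal ranks of 0.5 each.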
Consider a network of three pages with the following links:
| From | To |
| --- | --- |
| 1 | 2, 3 |
| 2 | 3 |
| 3 | 1 |
The adjacency matrix has ones at positions (2,1) and (3,1) because page 1 links to pages 2 and 3, a one at (3,2) because page 2 links to page 3, and a one at (1,3) because page 3 links back to page 1. With damping factor 0.85, iterating the update eventually yields PageRank scores of roughly 0.388 for page 1, 0.215 for page 2, and 0.397 for page 3. These scores sum to one and reflect both direct link counts and the importance of linking pages.
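This three-page result can be reproduced with a short standalone script (illustrative Python, independent of the calculator's own code):

```python
# Column-stochastic matrix for the example: column j holds page j's out-links.
# Page 1 -> {2, 3}, page 2 -> {3}, page 3 -> {1}.
M = [
    [0.0, 0.0, 1.0],  # row 1: inbound from page 3
    [0.5, 0.0, 0.0],  # row 2: inbound from page 1
    [0.5, 1.0, 0.0],  # row 3: inbound from pages 1 and 2
]
d = 0.85
n = 3
r = [1.0 / n] * n
for _ in range(100):
    r = [d * sum(M[i][j] * r[j] for j in range(n)) + (1 - d) / n
         for i in range(n)]
print([round(x, 3) for x in r])  # -> [0.388, 0.215, 0.397]
```

Note that page 3 outranks page 1 despite both having a single inbound link from the cycle's perspective: page 3 collects rank from two sources, and page 1 inherits all of page 3's rank in turn.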
Without damping, a network containing a closed loop with no outgoing edges would trap the random surfer, assigning zero rank to pages outside the loop. The teleportation mechanism prevents such rank sinks by occasionally jumping to any page. The value 0.85 became canonical in early Google papers, though other choices are possible. Lower values emphasize the uniform teleportation component, reducing the influence of links, while higher values make the algorithm more sensitive to network structure but slower to converge.
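The rank-sink effect is easy to reproduce in a toy network (a hypothetical example of my own: pages 1 and 2 link only to each other, page 3 links into the loop, and nothing links back to page 3):

```python
def pagerank(M, d, iters=200):
    """Power iteration with damping d on column-stochastic M."""
    n = len(M)
    r = [1.0 / n] * n
    for _ in range(iters):
        r = [d * sum(M[i][j] * r[j] for j in range(n)) + (1 - d) / n
             for i in range(n)]
    return r

# Column j describes page j's out-links: 1 -> 2, 2 -> 1, 3 -> 1.
M = [[0.0, 1.0, 1.0],
     [1.0, 0.0, 0.0],
     [0.0, 0.0, 0.0]]

print(pagerank(M, 0.85)[2])   # ≈ 0.05: teleportation keeps page 3's rank alive
print(pagerank(M, 0.999)[2])  # ≈ 0.0003: rank drains into the loop as d -> 1
```

Since no page links to page 3, its rank is exactly the teleportation share (1 − d)/n; as d approaches one, that share vanishes and the two-page loop absorbs essentially all the rank.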
The script behind this calculator constructs the column stochastic matrix M from your input. For each column j it counts the number of outgoing links; if the count is zero, each entry in that column is set to 1/n, otherwise entries become 1/outdegree where links exist. Starting with a uniform rank vector, the algorithm performs 100 iterations of the update rule. Because the network is small, this fixed number of iterations suffices to approximate convergence for most cases. The resulting vector is normalized so its components sum to one, providing an intuitive probability interpretation.
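That construction can be sketched as follows (illustrative Python rather than the page's actual browser script; `link_matrix` is a name of my own choosing):

```python
def link_matrix(A):
    """Build column-stochastic M from adjacency A, where A[i][j] = 1
    means page j links to page i (column = source, row = target)."""
    n = len(A)
    M = [[0.0] * n for _ in range(n)]
    for j in range(n):
        outdegree = sum(A[i][j] for i in range(n))
        for i in range(n):
            if outdegree == 0:
                M[i][j] = 1.0 / n       # dangling page: link to everyone
            elif A[i][j]:
                M[i][j] = 1.0 / outdegree
    return M
```

Applying it to the three-page example's adjacency matrix, `link_matrix([[0, 0, 1], [1, 0, 0], [1, 1, 0]])` yields the columns (0, 1/2, 1/2), (0, 0, 1), and (1, 0, 0).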
From linear algebra, the PageRank vector is the dominant eigenvector of the Google matrix G = d·M + ((1 − d)/n)·J, where J is the n-by-n matrix with all entries equal to one. Because G is stochastic and primitive, the Perron–Frobenius theorem guarantees a unique positive eigenvector. The power method converges linearly at a rate governed by the gap between the dominant eigenvalue (which is one) and the subdominant eigenvalue. Damping increases this gap, improving convergence.
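The linear convergence can be observed numerically: each power-iteration step multiplies the current difference vector by d·M, and a column-stochastic M cannot increase the L1 norm of a vector, so successive differences shrink by a factor of at most d per step (a small check using the three-page example above):

```python
d, n = 0.85, 3
M = [[0.0, 0.0, 1.0],   # three-page example: 1 -> {2, 3}, 2 -> 3, 3 -> 1
     [0.5, 0.0, 0.0],
     [0.5, 1.0, 0.0]]

def step(r):
    return [d * sum(M[i][j] * r[j] for j in range(n)) + (1 - d) / n
            for i in range(n)]

r = [1.0 / n] * n
diffs = []
for _ in range(30):
    nxt = step(r)
    diffs.append(sum(abs(nxt[i] - r[i]) for i in range(n)))  # L1 distance
    r = nxt

# Every step contracts the difference by at least the damping factor d.
ratios = [diffs[k + 1] / diffs[k] for k in range(len(diffs) - 1)]
assert max(ratios) <= d + 1e-9
```

In this particular graph the contraction is in fact faster than d, because the subdominant eigenvalue of M itself has magnitude below one; lowering d widens the spectral gap further and speeds convergence, at the cost of weighting links less.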
Modern search engines incorporate numerous signals beyond pure link structure, but PageRank remains a foundational idea. Variants include topic-sensitive PageRank, which biases teleportation toward a subset of pages to personalize results, and weighted PageRank, where link weights depend on factors such as click frequency or link position. Some researchers apply similar algorithms to citation networks, social media interactions, or even biological pathways. The general principle—using eigenvectors to quantify influence in a directed network—has widespread applicability.
To use this tool, fill in the adjacency matrix with ones wherever a link exists from the column page to the row page. For example, placing a one in row 2 column 1 indicates a link from page 1 to page 2. Leave zeros where no link exists. Enter a damping factor between 0 and 1, then click “Compute PageRank.” The calculator outputs a vector of five scores. Unused pages can simply have all zero rows and columns; their ranks will reflect only the teleportation component. Because everything runs in the browser, no data leaves your machine.
The numerical scores represent the long-run proportion of time a random surfer spends on each page. They can be compared directly: a page with rank 0.4 is twice as likely to be visited as one with rank 0.2 in the long run. While the scores are probabilities, scaling them by 100 renders them as percentages. A typical visualization is a bar chart or a list sorted from highest to lowest rank. When analyzing larger networks, pages with unexpectedly high PageRank may indicate hubs or authorities within the structure.
Real-world web graphs contain billions of pages, far beyond the five-node limit of this demonstration. At that scale, storing the matrix explicitly is impossible; algorithms work with sparse representations and distribute computations across clusters. Nevertheless, experimenting with small matrices illuminates the mechanics of the algorithm. You can explore how adding or removing a single link shifts the ranking, or how the ranks equalize when every page links to every other page. Such intuition is valuable when designing networks or interpreting link-based metrics.
PageRank emerged in the late 1990s amid the burgeoning growth of the web. Traditional search engines relied heavily on textual content and simple heuristics. Page and Brin recognized that hyperlinks encode collective human judgment: a link from page A to page B is a recommendation. By modeling the web as a graph and applying ideas from Markov chains and spectral theory, they produced rankings that resisted many forms of manipulation and returned remarkably relevant results. The success of Google popularized link analysis and spurred a wave of research into network algorithms.
Whether you are studying graph theory, learning about Markov chains, or curious about the principles behind search engines, PageRank offers a captivating example of mathematics in action. This calculator distills the algorithm to its essentials, letting you manipulate small graphs and observe how scores emerge from link patterns. By examining the random surfer model, damping, and power iteration, you gain insight into how global importance can be inferred from local connections. Experiment with different matrices and damping values to see how even tiny networks exhibit rich behavior.