Dijkstra's Algorithm in One Sitting
by ada · 2026-05-01 · edited 2026-05-01
Shortest path in a graph with non-negative edge weights. The negative-weight version is Bellman–Ford and is a different beast.
The idea: at every step, expand the node with the smallest currently-known distance from the source. Once you expand it, that distance is final — no shorter path can be found later, because all remaining edges have non-negative weight.
import heapq

def dijkstra(graph, start):
    dist = {start: 0}
    pq = [(0, start)]  # (distance, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue  # stale entry — we already found a shorter way
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
graph[u] is a list of (neighbor, weight) pairs.
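To make that concrete, here's the function run on a small example graph (the function is repeated so the snippet runs on its own; the node names A–D are illustrative):

```python
import heapq

def dijkstra(graph, start):
    dist = {start: 0}
    pq = [(0, start)]  # (distance, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue  # stale entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# A small directed graph as adjacency lists of (neighbor, weight) pairs.
graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1)],
    "D": [],
}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```

Note that C is first reached directly from A at cost 4, then improved to 3 via B — exactly the case that creates a stale heap entry.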
The "stale entry" trick
The heap can hold multiple (d, u) entries for the same node — one per time we improved its distance. Rather than fishing the old entry out (which heaps don't support efficiently), we leave it. When it eventually pops, the if d > dist[u]: continue check throws it away.
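The trick is easiest to see in isolation, outside the full algorithm. A minimal sketch (the node name and distances are made up):

```python
import heapq

# Lazy deletion: instead of decrease-key, push a fresh entry with the
# improved priority and skip outdated entries when they pop.
best = {"a": 5}
pq = [(5, "a")]

# Later we discover a shorter distance to "a":
best["a"] = 3
heapq.heappush(pq, (3, "a"))  # the old (5, "a") entry stays in the heap

processed = []
while pq:
    d, u = heapq.heappop(pq)
    if d > best[u]:
        continue  # stale — a better entry already popped
    processed.append((d, u))

print(processed)  # [(3, 'a')] — the (5, 'a') entry was silently discarded
```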
With lazy deletion the heap can hold up to E entries, so each push and pop costs O(log E) rather than the O(log V) of a true decrease-key heap. Since E ≤ V², log E ≤ 2 log V — asymptotically the same — and the lazy version is much simpler and typically faster in practice on sparse graphs.
Why it needs non-negative edges
Dijkstra's correctness rests on this monotonicity: when we pop a node u with distance d, every unprocessed node has a tentative distance ≥ d. With a negative edge, a path through some "later" node could shrink dist[u] after we'd already finalized it. The whole proof breaks.
For graphs with negative edges, use Bellman–Ford (slower, O(VE)) or, for DAGs, topologically sort and relax in order.
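For reference, Bellman–Ford is short enough to sketch here (a minimal version on an edge list; the example edges are made up):

```python
def bellman_ford(edges, n_nodes, start):
    # edges: list of (u, v, w) triples; nodes are 0..n_nodes-1.
    INF = float("inf")
    dist = [INF] * n_nodes
    dist[start] = 0
    for _ in range(n_nodes - 1):   # V-1 rounds of relaxing every edge
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:          # one more pass detects negative cycles
        if dist[u] + w < dist[v]:
            raise ValueError("negative cycle reachable from start")
    return dist

# A negative edge Dijkstra would mishandle: the cheapest way to node 2
# goes through the "expensive-looking" node 1.
edges = [(0, 1, 4), (0, 2, 5), (1, 2, -3)]
print(bellman_ford(edges, 3, 0))  # [0, 4, 1]
```

Dijkstra on this graph would finalize node 2 at distance 5 before ever seeing the -3 edge.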
Cost
O((V + E) · log V) with a binary heap. On a sparse graph (E ≈ V) that's O(V log V). On a dense graph (E ≈ V²) the heap costs O(V² log V), and you can do better: a plain O(V²) scan over vertices, or a Fibonacci heap at O(E + V log V). You'll rarely meet a problem where the difference matters.
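The O(V²) scan version is worth seeing once — no heap at all, just pick the closest unfinished vertex by linear search each round. A sketch on an adjacency matrix (None meaning "no edge"):

```python
def dijkstra_dense(adj, start):
    # adj: V x V matrix of weights, None where there is no edge.
    n = len(adj)
    INF = float("inf")
    dist = [INF] * n
    dist[start] = 0
    done = [False] * n
    for _ in range(n):
        # O(V) scan for the closest unfinished node, replacing the heap pop.
        u = min((i for i in range(n) if not done[i]),
                key=lambda i: dist[i], default=None)
        if u is None or dist[u] == INF:
            break  # remaining nodes are unreachable
        done[u] = True
        for v in range(n):
            w = adj[u][v]
            if w is not None and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist

adj = [
    [None, 1, 4],
    [None, None, 2],
    [None, None, None],
]
print(dijkstra_dense(adj, 0))  # [0, 1, 3]
```

Each of the V rounds does an O(V) scan plus an O(V) relaxation sweep, hence O(V²) total — which beats O(V² log V) when the graph is dense.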