# Bellman-Ford Shortest Path Algorithm

## Theory Behind Bellman-Ford Algorithm

The Bellman-Ford algorithm is a type of dynamic programming algorithm used to find the shortest paths from a single source vertex to all other vertices in a weighted graph. This algorithm is particularly useful when dealing with graphs that might contain negative-weight edges, which can cause issues for other shortest path algorithms like Dijkstra’s algorithm.

The problem that the Bellman-Ford algorithm aims to solve is the single-source shortest path problem. Given a weighted graph and a designated source vertex, the algorithm calculates the minimum distance from the source vertex to each of the other vertices in the graph. This is done by iteratively relaxing the edges, which means updating the distance estimates for each vertex based on the weights of the edges.

## Weighted Graphs and Shortest Path Problem

A weighted graph is a type of graph where each edge (the connection between two nodes) has an associated numerical value, called a weight. This weight can represent various properties, such as the distance, cost, or time required to travel between the connected nodes.

Here’s an example of a weighted graph represented using ASCII art:

``````
      5
A ---- B
| \    |
|  \   |
3   8  -3
|    \ |
|     \|
C ---- D
   2
``````

In this example, the graph consists of four nodes (`A`, `B`, `C`, and `D`) and five edges. The number associated with each edge represents the weight or cost of traveling between the connected nodes. For example, the edge `A`->`B` has weight `5`, the edge `A`->`D` has weight `8`, and so on. If you need to travel from `A` to `D`, you can find the cheapest route by computing the total weight of each possible path: `A->B->D` has a total weight of `2`, `A->D` has a total weight of `8`, and `A->C->D` has a total weight of `5`. So the path `A->B->D` is the most efficient.

The above graph can be represented as an adjacency list shown below:

``````
graph = {
    "A": [("B", 5), ("C", 3), ("D", 8)],
    "B": [("D", -3)],
    "C": [("D", 2)],
    "D": []
}
``````

Notice that the edge `B->D` has a negative weight `-3` and this can be counterintuitive at first glance. While weights are often associated with positive values like distances or tolls, negative weights can represent unique scenarios that add depth to the model.

For example, consider an electric car navigating a hilly terrain. When the car travels downhill, it can use regenerative braking to recapture energy and extend its range. In this context, a negative weight on an edge could represent a downward slope where the car gains energy instead of expending it. This negative weight effectively increases the estimated distance the car can travel, as it recharges its battery during that segment of the journey. Such a model allows for more accurate route planning and energy management in electric vehicle navigation systems, accounting for both energy consumption on uphill stretches and energy regeneration on downhill portions.

## Bellman-Ford Algorithm and the Intuition Behind it

### Concept of Distance Relaxation

The concept of relaxation is the key to understanding the Bellman-Ford algorithm. Relaxation is the process of updating the distance estimate of a node by considering the distance from the source node to the current node, plus the weight of the edge connecting the current node to the next node. If this combined distance is shorter than the current distance estimate for the next node, the distance estimate is updated.

For example, imagine you’re planning a road trip from town `A` to town `C`. At first, you estimate that the quickest route is a direct road that takes `10` hours. You also know there’s a town `B` that’s `3` hours from `A`. Now, you discover there’s a shortcut from `B` to `C` that only takes `2` hours. This new information makes you rethink your trip. You realize that if you go from `A` to `B` (`3` hours) and then take the shortcut from `B` to `C` (`2` hours), the whole journey would only take `5` hours instead of `10`. This process of finding a better route and updating your travel plan and time estimate is exactly what “relaxation” means in the Bellman-Ford algorithm. You’re relaxing your original assumption about the best path when you find a quicker way. In this case, you relaxed the path to `C` from `10` hours down to `5` hours by going through `B`. This is how the Bellman-Ford algorithm gradually discovers shorter paths in a network, always on the lookout for better routes.
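The road-trip example above can be written as a single relaxation step. The sketch below uses the town names and hour values from the example:

```python
# Current distance estimates (in hours) from source town A.
# The direct road A->C takes 10 hours; B is 3 hours from A.
distance = {"A": 0, "B": 3, "C": 10}

# Relax the shortcut edge B->C with weight 2
u, v, w = "B", "C", 2
if distance[u] + w < distance[v]:
    distance[v] = distance[u] + w  # 3 + 2 = 5, better than 10

print(distance["C"])  # 5
```

The `if` check is the entire relaxation operation; the Bellman-Ford algorithm is simply this check applied to every edge, repeatedly.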

### Using Repeated Relaxation to Improve Distance Estimates

Let’s use a simple square graph to understand how we use the concept of relaxation in the Bellman-Ford algorithm:

``````A --- 3 --- B
|           |
4           2
|           |
D --- 1 --- C
``````

In the above graph,

• We have 4 nodes: `A`, `B`, `C`, and `D`
• The edges are: `A-B` (weight 3), `B-C` (weight 2), `C-D` (weight 1), and `D-A` (weight 4)
• Let’s say node `A` is the source node from which we need to find shortest distances to all other nodes.
1. Initially, we assume that the distance from the source node to every other node in the graph is `∞` (infinity). This is how our initial distance estimates from source node `A` look: `{A: 0, B: ∞, C: ∞, D: ∞}`
2. First round of relaxation, considering all edges:
• Check edge `A->B`: Current estimated distance from source node `A` to `B` (`∞`) > Distance to `A` (`0`) + Weight of `A->B` (`3`). So we relax this edge and update `B`’s distance: `{A: 0, B: 3, C: ∞, D: ∞}`
• Check edge `A->D`: Current estimated distance from source node `A` to `D` (`∞`) > Distance to `A` (`0`) + Weight of `A->D` (`4`). So we relax this edge and update `D`’s distance: `{A: 0, B: 3, C: ∞, D: 4}`
• Check edge `B->C`: Current estimated distance from source node `A` to `C` (`∞`) > Distance to `B` (`3`) + Weight of `B->C` (`2`). So we relax this edge and update `C`’s distance: `{A: 0, B: 3, C: 5, D: 4}`
• Check edge `C->D`: Current estimated distance from source node `A` to `D` (`4`) < Distance to `C` (`5`) + Weight of `C->D` (`1`). No improvement in distance, so no update.
3. Second round of relaxation:
• Check all edges again. No changes are made, as all current distances are already optimal.
4. Third round of relaxation:
• Again, no changes are made.

After three rounds of relaxation (equal to the number of nodes minus one), we have found the shortest paths from the source node `A` to all other nodes. The final distances are: `{A: 0, B: 3, C: 5, D: 4}`

In each relaxation pass, you can visualize the algorithm pushing the shortest-distance estimate outward from the source node by at least one more edge. Since the shortest path from the source node to any other node can contain at most `V-1` edges, where `V` is the number of nodes in the graph, performing the relaxation process `V-1` times is guaranteed to settle every distance estimate at the lowest possible value.
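The walkthrough above can be traced in a few lines of Python. The directed edge list below mirrors the edges checked in the walkthrough:

```python
import math

# Directed edges of the square graph: (u, v, weight)
edges = [("A", "B", 3), ("A", "D", 4), ("B", "C", 2), ("C", "D", 1)]

# Step 1: all distances start at infinity, except the source A
distance = {"A": 0, "B": math.inf, "C": math.inf, "D": math.inf}

# Steps 2-4: V - 1 = 3 rounds of relaxation over all edges
for _ in range(3):
    for u, v, w in edges:
        if distance[u] + w < distance[v]:
            distance[v] = distance[u] + w

print(distance)  # {'A': 0, 'B': 3, 'C': 5, 'D': 4}
```

As in the walkthrough, only the first round changes anything; the remaining rounds confirm the estimates are already optimal.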

### Bellman-Ford Algorithm

Putting it all together, below are the steps of the Bellman-Ford shortest path algorithm:

1. Start by assuming all nodes are infinitely (`∞`) far from the source, except the source itself (distance `0`).
2. The algorithm then makes `V-1` “passes” through all the edges in the graph. The following steps are performed in each pass:
• For each edge, it checks if using this edge would result in a shorter path to the destination node.
• If so, it updates (relaxes) the distance to that node.
3. Why `V-1` passes? Because in the worst case, the shortest path might need to go through all `V-1` edges to reach the farthest node.
4. With each pass, the algorithm discovers and propagates shorter paths, like ripples spreading through the graph.
5. After `V-1` passes, the algorithm is guaranteed to have found the shortest paths to all reachable nodes (given the graph doesn’t contain any negative cycles).

## Implementation of Bellman-Ford Algorithm

``````
import math

# Function to find the minimum distance paths
# from the source node to all other nodes in the graph
def bellman_ford(graph, source_node):
    # Initialize distance estimates from the source node
    # to each node in the graph to infinity,
    # except for the source node which is 0
    distance_from_source = {}
    for node in graph:
        if node == source_node:
            distance_from_source[node] = 0
        else:
            distance_from_source[node] = math.inf

    # Initialize parent dictionary to store
    # parent information along the shortest path
    parent = {}

    # Relax edges n-1 times, where n is the number of nodes
    for i in range(1, len(graph)):
        for u, neighbors in graph.items():
            for v, w in neighbors:
                # Relax the edge if the distance to v can be improved
                if distance_from_source[u] + w < distance_from_source[v]:
                    distance_from_source[v] = distance_from_source[u] + w
                    parent[v] = u

    # Return the minimum distance and previous node for each node
    return distance_from_source, parent

graph = {
    "A": [("B", 5), ("C", 3), ("D", 8)],
    "B": [("D", -3)],
    "C": [("D", 2)],
    "D": []
}

distances_from_source, parents_along_shortest_path = bellman_ford(graph, "A")
print("Shortest distances from source node A are:")
print(distances_from_source)
``````

In the above code, `u` and `v` refer to two nodes in the graph and `w` refers to the weight of the edge between them. If the path from the source node to node `v` is shorter through node `u`, we relax the distance estimate for node `v`. This is reflected in the condition: `if distance_from_source[u] + w < distance_from_source[v]`.
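The `parent` dictionary returned by `bellman_ford` can be used to recover the actual shortest route, not just its length, by walking parent pointers backward from the target. Below is a small helper sketch; the `parents` dictionary is hardcoded to the values the function above would produce for the example graph with source `A`:

```python
def reconstruct_path(parent, source_node, target_node):
    """Walk parent pointers back from target to source, then reverse."""
    path = [target_node]
    while path[-1] != source_node:
        path.append(parent[path[-1]])
    return list(reversed(path))

# Parents for the example graph with source "A":
# the cheapest route to D is A -> B -> D (5 + (-3) = 2)
parents = {"B": "A", "C": "A", "D": "B"}
print(reconstruct_path(parents, "A", "D"))  # ['A', 'B', 'D']
```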

### Optimization

The standard Bellman-Ford algorithm shown above can be optimized by terminating the loop as soon as it is clear that no more updates to the distance estimates are possible. This can be done by keeping track of whether any distance estimate was updated during the previous iteration. If no updates were made, the algorithm can terminate, as it has already found the shortest paths. Below is pseudocode illustrating how to implement the optimization:

``````
// Main loop
for i = 1 to |V| - 1:  // |V| is the number of vertices
    updated = false  // Flag to track if any updates occurred

    // Relax all edges
    for each edge (u, v) with weight w in Graph:
        if distance[u] + w < distance[v]:
            distance[v] = distance[u] + w
            predecessor[v] = u
            updated = true  // Mark that an update occurred

    // If no updates occurred, we can terminate early
    if not updated:
        break  // Exit the main loop
``````
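Translated back into the Python implementation above, the early-exit check could look like the sketch below, which reuses the same adjacency-list format (the function name is illustrative):

```python
import math

def bellman_ford_early_exit(graph, source_node):
    distance = {node: math.inf for node in graph}
    distance[source_node] = 0
    parent = {}

    # At most V-1 passes, but stop as soon as a pass changes nothing
    for _ in range(len(graph) - 1):
        updated = False  # track whether this pass made any update
        for u, neighbors in graph.items():
            for v, w in neighbors:
                if distance[u] + w < distance[v]:
                    distance[v] = distance[u] + w
                    parent[v] = u
                    updated = True
        if not updated:
            break  # distances are already optimal

    return distance, parent

graph = {
    "A": [("B", 5), ("C", 3), ("D", 8)],
    "B": [("D", -3)],
    "C": [("D", 2)],
    "D": []
}
print(bellman_ford_early_exit(graph, "A")[0])
# {'A': 0, 'B': 5, 'C': 3, 'D': 2}
```

On this graph, every distance settles in the first pass, so the second pass makes no updates and the loop exits after two passes instead of three.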

## Detecting Negative Cycles

A negative cycle is a cycle in the graph where the sum of the weights of the edges in the cycle is negative. This creates a situation where you could theoretically keep traversing the cycle to reduce the path length indefinitely.

Theoretically, for a graph without any negative cycles, no distance estimate should decrease after `V-1` relaxation passes. We can then perform a `V`th pass, and if any distance estimate improves during it, there must be a negative cycle in the graph. This is because an improvement on the `V`th pass would mean the “shortest” path uses `V` or more edges, which can only happen if the path revisits a node, i.e., if it goes around a negative cycle.
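This check can be sketched in Python as follows: run the usual `V-1` passes, then do one extra pass and report a negative cycle if anything still improves (the function name and example graph are illustrative):

```python
import math

def has_negative_cycle(graph, source_node):
    distance = {node: math.inf for node in graph}
    distance[source_node] = 0

    # Standard V-1 relaxation passes
    for _ in range(len(graph) - 1):
        for u, neighbors in graph.items():
            for v, w in neighbors:
                if distance[u] + w < distance[v]:
                    distance[v] = distance[u] + w

    # V-th pass: any further improvement implies a negative cycle
    for u, neighbors in graph.items():
        for v, w in neighbors:
            if distance[u] + w < distance[v]:
                return True
    return False

# A -> B -> C -> A forms a cycle with total weight 1 + (-2) + (-1) = -2
cyclic = {"A": [("B", 1)], "B": [("C", -2)], "C": [("A", -1)]}
print(has_negative_cycle(cyclic, "A"))  # True
```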

## Time & Space Complexity

### Time Complexity

The time complexity of the Bellman-Ford algorithm is `O(V * E)`, where `V` is the number of vertices and `E` is the number of edges in the graph. This is because the algorithm makes `V-1` passes over the graph, and in each pass it attempts to relax all `E` edges. With the early-termination optimization, the loop stops as soon as a pass makes no updates, but the worst case remains `O(V * E)`.

### Space Complexity

The space complexity of the Bellman-Ford algorithm is `O(V)`, where `V` is the number of vertices in the graph. This is because the algorithm requires the use of a distance array, which stores the estimates of current distance of each vertex from the source vertex. This array/dictionary has a size of `V`, as there are `V` vertices in the graph.

Additionally, the algorithm may also use a predecessor/parent array to keep track of the previous vertex in the shortest path to each vertex. This array also has a size of `V`.