Dynamic Programming
Dynamic programming algorithms solve problems by dividing them into smaller subproblems, storing the results of these subproblems, and later combining them to solve the larger problem. These algorithms can solve problems where the optimal solution to the problem is composed of the optimal solutions to its subproblems, i.e., problems with optimal substructure.
Greedy algorithms solve problems by choosing the option that seems best at the moment, whereas dynamic programming solves problems by storing the results of overlapping subproblems and using them in the solution. Therefore, problems without overlapping subproblems are not solved using dynamic programming.
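To make the overlap concrete, here is a small Python sketch (an illustration, not from the original text) that counts how often plain recursion re-solves the same Fibonacci subproblem:

```python
def fib_calls(n, counter):
    # Counts how many times each subproblem fib(k) is solved by naive recursion.
    counter[n] = counter.get(n, 0) + 1
    if n < 2:
        return n
    return fib_calls(n - 1, counter) + fib_calls(n - 2, counter)

counter = {}
fib_calls(10, counter)
print(counter[1])  # fib(1) alone is recomputed 55 times for n = 10
```

Because the same subproblems recur exponentially often, storing each answer once (the core move of dynamic programming) collapses the running time from exponential to linear.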
Dynamic programming: Optimal Substructure, Overlapping Subproblems. Examples: Fibonacci Sequence, Dijkstra's algorithm.
Greedy algorithm: Optimal Substructure, Greedy Choice Property. Example: Dijkstra's algorithm.
Divide-and-Conquer: Optimal Substructure. Examples: Merge Sort, Quick Sort.
Dijkstra's algorithm is a case that fits both dynamic programming and greedy algorithms. At each step it settles the unvisited vertex with the smallest tentative distance, embodying the greedy algorithm's property of making a locally optimal choice. At the same time, it stores already-computed shortest distances and reuses them when relaxing edges, solving overlapping subproblems, which is a characteristic of dynamic programming.
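A minimal Python sketch of how both ideas appear in Dijkstra's algorithm, assuming the graph is given as an adjacency dict of `(neighbor, weight)` lists (an assumed representation):

```python
import heapq

def dijkstra(graph, source):
    # graph: {node: [(neighbor, weight), ...]}
    dist = {source: 0}            # stored solutions to subproblems (the DP aspect)
    pq = [(0, source)]            # min-heap ordered by tentative distance
    while pq:
        d, u = heapq.heappop(pq)  # greedy choice: closest unsettled vertex
        if d > dist.get(u, float("inf")):
            continue              # stale entry; u was already settled with a shorter path
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd      # reuse dist[u] instead of recomputing the path to u
                heapq.heappush(pq, (nd, v))
    return dist

g = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 6)], "C": [("D", 3)], "D": []}
print(dijkstra(g, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 6}
```

The priority queue makes the locally optimal (greedy) pick, while the `dist` table plays the role of the memo: once a vertex's shortest distance is settled, it is never recomputed.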

Methodology
Bottom-up approach: This starts from the smallest subproblems and uses their solutions to build up solutions to larger ones. This method is known as tabulation.
Top-down approach: This solves the problem recursively in its natural form, checking whether a subproblem's solution has already been computed before solving it again. This method is known as memoization.
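Both approaches can be sketched side by side with Fibonacci (function names here are illustrative):

```python
import functools

def fib_tab(n):
    # Bottom-up (tabulation): fill a table from the smallest subproblems upward.
    if n < 2:
        return n
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]

@functools.lru_cache(maxsize=None)
def fib_memo(n):
    # Top-down (memoization): recurse naturally, caching each subproblem's answer.
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_tab(10), fib_memo(10))  # 55 55
```

Tabulation visits every subproblem in a fixed order and avoids recursion depth limits; memoization keeps the natural recursive structure and only computes the subproblems that are actually needed.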