Recently, I reviewed dynamic programming and greedy algorithms and summarized them with the help of GPT.
The Difference Between Dynamic Programming and Greedy Algorithms
In general, if a problem satisfies the properties of optimal substructure and overlapping subproblems, dynamic programming can be considered. If a problem satisfies the property of optimal substructure and can achieve the global optimal solution through local optimal solutions (greedy choice property), greedy algorithms can be considered. If a problem satisfies all of the above characteristics, both dynamic programming and greedy algorithms can be used, and the specific choice depends on the requirements and constraints of the actual problem.
 Overlapping Subproblems
Dynamic programming is usually used when a problem contains a large number of overlapping subproblems. Once a subproblem's solution is computed, it is stored so that it can be reused directly whenever the same subproblem arises again. In contrast, greedy algorithms do not consider overlapping subproblems.

 Optimal Substructure
Both dynamic programming and greedy algorithms require the problem to have optimal substructure, meaning the optimal solution to the original problem can be built from the optimal solutions of its subproblems.

 Greedy Choice Property
Greedy algorithms rely on the greedy choice property: a locally optimal choice leads to a globally optimal solution. A greedy algorithm makes the best choice at each decision stage regardless of the consequences for future decisions, while dynamic programming evaluates all possible options and chooses the optimal one.

 Approach to Problem Solving
Greedy algorithms are usually iterative: the locally best option is chosen at each step, without revisiting earlier decisions or anticipating future ones. Dynamic programming usually builds a table to store and look up the solutions to all subproblems.
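The contrast above can be sketched with the Fibonacci numbers: a naive recursion recomputes the same subproblems exponentially many times, while top-down memoization and a bottom-up table each solve every subproblem exactly once. A minimal sketch (function names are my own):

```python
from functools import lru_cache

def fib_naive(n: int) -> int:
    # Plain recursion: recomputes the same subproblems over and over.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    # Top-down DP: each subproblem is solved once and cached.
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

def fib_table(n: int) -> int:
    # Bottom-up DP: fill a table of subproblem solutions in order.
    if n < 2:
        return n
    table = [0] * (n + 1)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```

`fib_naive(40)` already takes noticeable time, while the memoized and tabulated versions answer instantly, because they store and reuse overlapping subproblem solutions.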
How to Determine Whether to Use Dynamic Programming or Greedy Algorithms to Solve a Problem?
 Determine if the problem is an optimization problem
For example, problems like finding the shortest path or maximizing profit may call for dynamic programming or greedy algorithms.

 Look for the property of "optimal substructure"
Optimal substructure means that the optimal solution to the problem can be composed of optimal solutions to its subproblems. If such a structure exists, dynamic programming or greedy algorithms may be suitable for solving the problem.

 For dynamic programming, also consider if there are "overlapping subproblems"
This refers to the situation where several larger problems share the same subproblem, so recomputing it each time is inefficient. Such problems are often well suited to dynamic programming.

 Determine if the problem satisfies the "greedy choice property"
This means that a locally optimal choice can lead to a globally optimal solution; in other words, you can make the best choice step by step without considering the overall situation. If this property exists, a greedy algorithm may be suitable.

 Consider the complexity of the problem and the required solution speed
Greedy algorithms are usually simpler and faster, but they do not always find the optimal solution. Dynamic programming may be more complex and slower, but it is more likely to find the optimal solution.

 Consider the scale of the problem
For large-scale problems, dynamic programming may require a large amount of memory to store subproblem solutions, which may not be feasible.
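The trade-off between greedy speed and DP optimality shows up concretely in the coin-change problem: with denominations {1, 3, 4}, the greedy "always take the largest coin" rule is fast but suboptimal, while DP guarantees the minimum. A minimal sketch (function names are my own):

```python
def min_coins_greedy(coins, amount):
    # Greedy: always take the largest coin that still fits.
    # Locally optimal, but not always globally optimal.
    count = 0
    for coin in sorted(coins, reverse=True):
        take = amount // coin
        count += take
        amount -= take * coin
    return count if amount == 0 else None

def min_coins_dp(coins, amount):
    # DP: dp[a] = fewest coins needed to make amount a.
    INF = float("inf")
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for coin in coins:
            if coin <= a and dp[a - coin] + 1 < dp[a]:
                dp[a] = dp[a - coin] + 1
    return dp[amount] if dp[amount] != INF else None
```

For amount 6, greedy takes 4 + 1 + 1 (three coins) while DP finds 3 + 3 (two coins): the greedy choice property does not hold for these denominations, so DP is the right tool here.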
How to Break Down Subproblems?
Breaking down subproblems is a key step in solving dynamic programming and greedy algorithm problems.
Remember, breaking down subproblems requires practice and experience. Different problems may require different methods to break down subproblems, which takes time and patience to learn.
 Determine the optimal substructure
Try to find a smaller, simplified version or subset of the problem whose solution has the same shape as the original; the optimal solution to this subproblem will help build the optimal solution to the original problem.

 Define the subproblem
You need to clearly define what the subproblem is. For example, in the shortest path problem, a subproblem may be the shortest path from the starting point to a certain intermediate point.

 State transition equation
This equation expresses how the subproblems are related to each other: it tells you how to derive the solution to the current subproblem from the subproblems already solved.

 Boundary conditions
Find the simplest subproblems, whose solutions are obvious. For example, in the Fibonacci sequence problem, the first two numbers are the boundary conditions.

 Relationship between subproblems
In dynamic programming, subproblems often overlap, meaning that some subproblems will reappear when solving other subproblems. In this case, the answers to the already solved subproblems can be used to avoid redundant calculations.
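These steps can be walked through on the classic climbing-stairs problem (count the ways to climb n steps taking 1 or 2 at a time): the subproblem is ways[i], the state transition equation is ways[i] = ways[i-1] + ways[i-2], and the boundary conditions are ways[0] = ways[1] = 1. A minimal sketch:

```python
def climb_stairs(n: int) -> int:
    # Subproblem: ways[i] = number of ways to reach step i.
    # Boundary conditions: ways[0] = ways[1] = 1.
    # State transition: ways[i] = ways[i-1] + ways[i-2]
    # (the last move was either a 1-step or a 2-step).
    if n < 2:
        return 1
    ways = [0] * (n + 1)
    ways[0], ways[1] = 1, 1
    for i in range(2, n + 1):
        ways[i] = ways[i - 1] + ways[i - 2]
    return ways[n]
```

Note how the overlap between subproblems is handled: ways[i-1] and ways[i-2] are each computed once and then looked up, never recomputed.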
5 Typical Dynamic Programming Problems

 Fibonacci Sequence: Given n, calculate the nth term of the Fibonacci sequence.

 Rod Cutting: Given a rod of length n and a price for each possible piece length, find the way of cutting the rod that maximizes total value.

 Longest Common Subsequence: Given two strings, find the longest common subsequence.

 0-1 Knapsack Problem: Given a set of items, each with a weight and value, determine the maximum total value while not exceeding the weight limit of the knapsack.

 Matrix Chain Multiplication: Given a chain of matrices, determine the optimal order of multiplication to minimize the total number of multiplications.
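As one worked example from this list, the longest common subsequence can be solved with a 2-D table where dp[i][j] is the LCS length of the first i characters of one string and the first j characters of the other. A minimal sketch (the function name is my own):

```python
def lcs_length(a: str, b: str) -> int:
    # dp[i][j] = length of the LCS of a[:i] and b[:j].
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                # Matching characters extend the LCS of the prefixes.
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                # Otherwise drop one character from either string.
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```

The boundary conditions are the zero row and column (an empty prefix shares no subsequence with anything), and the two branches inside the loop are the state transition equation.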
5 Typical Greedy Algorithm Problems

 Activity Selection Problem: Given a set of activities, each with a start time and end time, select the maximum number of activities that do not overlap in time.

 Huffman Coding: An algorithm used for data compression, which requires constructing an optimal Huffman tree.

 Dijkstra's Algorithm: Used to find the shortest path in a graph from a single source.

 Fractional Knapsack Problem: Similar to the 0-1 knapsack problem, but items can be divided, and we need to determine how to select and divide items to maximize the total value.

 Kruskal's Algorithm and Prim's Algorithm: Used to find the minimum spanning tree of a graph.
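As one worked example from this list, the activity selection problem is solved by a single greedy rule: among the activities compatible with what has been chosen so far, always pick the one that finishes earliest. A minimal sketch (the function name is my own):

```python
def select_activities(activities):
    # activities: list of (start, end) pairs.
    # Greedy choice: sort by finish time and always take the
    # earliest-finishing activity that does not overlap.
    chosen = []
    last_end = float("-inf")
    for start, end in sorted(activities, key=lambda a: a[1]):
        if start >= last_end:
            chosen.append((start, end))
            last_end = end
    return chosen
```

Unlike dynamic programming, no table of subproblem solutions is kept: each choice is final, which is valid here because finishing earliest provably leaves the most room for the remaining activities.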