
In the matrix chain multiplication (MCM) problem, each time we apply a decision of parenthesizing an expression $e=(e_1)(e_2)$ we get two subproblems to solve, but they are not overlapping: solving MCM for $e_1$ does not require anything used to solve MCM for $e_2$, and vice versa. On the other hand, computing the Fibonacci numbers (FIB) via $f(n)=f(n-1)+f(n-2)$ also requires solving two subproblems, $f(n-1)$ and $f(n-2)$, in order to solve $f(n)$, and these are overlapping because both rely on solving, for example, $f(n-3)$ (see the sketch at the end of the question). However, FIB is not even an optimization problem.

Is there a classic problem which is an optimization problem where, as in MCM, a decision leads to multiple subproblems, but where those subproblems are overlapping? In other words, I want the three properties together: the problem is an optimization problem, one decision leads to multiple subproblems, and the subproblems resulting from that decision are overlapping.

For example, rod cutting is an optimization problem with overlapping subproblems, but if at stage $i$ we decide to cut off and sell a piece of size $l_i$, we are left with just one subproblem, namely how best to cut the remaining part of size $l-l_i$.
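
To make the FIB overlap concrete, here is a minimal sketch (Python; the call counter is just my illustration) that records how often the naive recursion requests each subproblem:

```python
from collections import Counter

calls = Counter()   # how many times each subproblem f(n) is requested

def f(n):
    calls[n] += 1
    if n < 2:
        return n
    return f(n - 1) + f(n - 2)

f(6)
print(calls[3])   # 3: both the f(5) and f(4) branches end up re-solving f(3)
```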

  • Isn't the dynamic programming algorithm in general an example of this? Commented Jul 30, 2024 at 17:47

1 Answer


Contrary to what you wrote, matrix chain multiplication already meets your requirements: it does have overlapping subproblems. When asking for the best parenthesization of $ABCD$, at the top level we explore both $(A)(BCD)$ and $(ABC)(D)$, and at the second level both of these explore the subproblem $BC$.
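
To see the overlap concretely, here is a minimal memoized sketch (Python; the dimensions and the request counter are just my illustration) that records how often each chain segment is requested as a subproblem:

```python
from collections import Counter

dims = [10, 30, 5, 60, 10]   # A is 10x30, B is 30x5, C is 5x60, D is 60x10
requests = Counter()
memo = {}

def best_cost(i, j):
    """Minimum scalar multiplications to compute the product of matrices i..j."""
    requests[(i, j)] += 1    # count every time some split asks for this segment
    if (i, j) not in memo:
        if i == j:
            memo[(i, j)] = 0
        else:
            memo[(i, j)] = min(
                best_cost(i, k) + best_cost(k + 1, j)
                + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return memo[(i, j)]

print(best_cost(0, 3))    # 5000: optimal cost for the chain ABCD
print(requests[(1, 2)])   # 2: the segment BC is requested by two different parent splits
```

The segment $BC$ (indices $1..2$ here) is asked for by two different parent splits, which is exactly the overlap that memoization exploits.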

Another example is finding the longest common subsequence between two strings.
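
For instance, here is a minimal memoized LCS sketch (Python, my own illustration): when the current characters differ, a single step spawns the two subproblems $(i+1, j)$ and $(i, j+1)$, and both of them go on to request $(i+1, j+1)$, so the subproblems overlap.

```python
from functools import lru_cache

def lcs_length(x: str, y: str) -> int:
    @lru_cache(maxsize=None)
    def solve(i: int, j: int) -> int:
        # length of the LCS of x[i:] and y[j:]
        if i == len(x) or j == len(y):
            return 0
        if x[i] == y[j]:
            return 1 + solve(i + 1, j + 1)
        # one decision, two overlapping subproblems
        return max(solve(i + 1, j), solve(i, j + 1))
    return solve(0, 0)

print(lcs_length("ABCBDAB", "BDCABA"))  # 4, e.g. "BCBA"
```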

There is no objectively meaningful distinction between decision vs optimization problems. Any decision problem can be considered an optimization problem: the objective function should output 1 if the candidate solution is a correct solution, or 0 otherwise; then maximizing the objective function (an optimization problem) is equivalent to asking whether there exists any correct solution (a decision problem).

Consequently, one can create trivial examples that satisfy your requirements. For instance, let $G_x(y) = 1$ if $y=\text{Fib}(x)$ and 0 otherwise; now, given $x$, find a $y$ that maximizes $G_x(\cdot)$. There is a dynamic programming algorithm for this that meets all of your requirements.
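
A rough sketch of that construction (the function names are mine): the "optimization" simply wraps a memoized Fibonacci, whose recurrence is the decision that spawns the two overlapping subproblems $x-1$ and $x-2$.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(x: int) -> int:
    # one "decision" (apply the recurrence) spawns two overlapping subproblems
    return x if x < 2 else fib(x - 1) + fib(x - 2)

def maximize_G(x: int) -> int:
    """Return the y achieving G_x(y) = 1, i.e. the maximizer of G_x."""
    return fib(x)

print(maximize_G(10))  # 55
```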

  • Thank you for pointing out that the MCM problem has overlapping subproblems! I guess an example of an optimization problem with multiple subproblems per decision which are not overlapping would be Maximum Subarray Sum: we decide to cut the problem at a given index and are left with two subproblems which are not overlapping, and that we then need to combine in a certain way. Would this divide and conquer still be considered a special case of dynamic programming? Commented Jul 31, 2024 at 9:50
  • @edamondo, That's a matter of opinion, I suppose. I wouldn't worry about what it is called. I would focus more on practicing coming up with such algorithms. The idea of divide-and-conquer and dynamic programming is that they give you some patterns you can use to try to design an algorithm. Commented Jul 31, 2024 at 17:02
