They are ideal only for problems which have 'optimal substructure'.
A slightly more formal definition of optimal substructure can be given.
In computer science, a problem that can be broken apart like this is said to have optimal substructure.
Such optimal substructures are usually described by means of recursion.
Such an example is likely to exhibit optimal substructure.
Consequently, the first step towards devising a dynamic programming solution is to check whether the problem exhibits such optimal substructure.
It is applicable to problems exhibiting the properties of overlapping subproblems and optimal substructure (described below).
There are two key attributes that a problem must have in order for dynamic programming to be applicable: optimal substructure and overlapping subproblems.
This problem exhibits optimal substructure.
Optimal substructure means that the solution to a given optimization problem can be obtained by the combination of optimal solutions to its subproblems.
In computer science, a problem is said to have optimal substructure if an optimal solution can be constructed efficiently from optimal solutions of its subproblems.
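The definition above can be illustrated with a small sketch. This is an illustrative example, not from the source: a minimum coin-change solver in which the optimal answer for an amount is built from optimal answers for smaller amounts (the coin denominations here are arbitrary).

```python
def min_coins(amount, coins=(1, 3, 4)):
    # Optimal substructure: the fewest coins for `amount` is
    # one coin c plus the fewest coins for `amount - c`,
    # so each entry is constructed from optimal subproblem answers.
    best = [0] + [float("inf")] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a:
                best[a] = min(best[a], best[a - c] + 1)
    return best[amount]

min_coins(6)  # → 2  (3 + 3)
```

Note that a greedy choice (take 4, then 1, then 1) would use three coins here, so the optimal answer genuinely requires combining optimal subproblem solutions.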
Note that the problem of finding the shortest addition chain cannot be solved by dynamic programming, because it does not satisfy the assumption of optimal substructure.
"A problem exhibits optimal substructure if an optimal solution to the problem contains optimal solutions to the sub-problems."
As an example of a problem that is unlikely to exhibit optimal substructure, consider the problem of finding the cheapest airline ticket from Buenos Aires to Moscow.
Typically, a greedy algorithm is used to solve a problem with optimal substructure if it can be proved by induction that this is optimal at each step (Cormen et al. pp.
If minimizing the local functions is a problem of "lower order", and (specifically) if, after a finite number of these reductions, the problem becomes trivial, then the problem has an optimal substructure.
The LCS problem has an optimal substructure: the problem can be broken down into smaller, simple "subproblems", which can be broken down into yet simpler subproblems, and so on, until, finally, the solution becomes trivial.
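The LCS recursion described above can be sketched briefly; this is a standard top-down formulation (with memoization via `functools.lru_cache`), not code from the source.

```python
from functools import lru_cache

def lcs(a, b):
    # Optimal substructure: LCS of two prefixes is built from the
    # LCS of strictly shorter prefixes, until a prefix is empty
    # (the trivial base case).
    @lru_cache(maxsize=None)
    def go(i, j):
        if i == 0 or j == 0:
            return 0
        if a[i - 1] == b[j - 1]:
            return go(i - 1, j - 1) + 1
        return max(go(i - 1, j), go(i, j - 1))

    return go(len(a), len(b))

lcs("ABCBDAB", "BDCABA")  # → 4
```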
Because of the way this algorithm uses optimal substructures (the maximum subarray ending at each position is calculated in a simple way from a related but smaller and overlapping subproblem, the maximum subarray ending at the previous position) this algorithm can be viewed as a simple example of dynamic programming.
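The algorithm described above is commonly known as Kadane's algorithm; a minimal sketch (the input list is illustrative):

```python
def max_subarray(nums):
    # Kadane's algorithm: the maximum subarray ending at position i
    # is computed in O(1) from the maximum subarray ending at i-1,
    # the smaller overlapping subproblem mentioned above.
    best = current = nums[0]
    for x in nums[1:]:
        current = max(x, current + x)
        best = max(best, current)
    return best

max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4])  # → 6
```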
When a problem shows optimal substructure, meaning the optimal solution to a problem can be constructed from optimal solutions to subproblems, and overlapping subproblems, meaning the same subproblems are used to solve many different problem instances, a quicker approach called dynamic programming avoids recomputing solutions that have already been computed.
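Both properties, and the saving from not recomputing solutions, are visible in the classic Fibonacci example; a minimal sketch using Python's `functools.lru_cache` as the memo table:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Overlapping subproblems: fib(n-1) and fib(n-2) share almost all
    # of their recursive calls; the cache stores each result once,
    # turning exponential recursion into linear work.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(50)  # → 12586269025
```

Without the cache, `fib(50)` would make on the order of billions of redundant calls; with it, each of the 51 subproblems is solved exactly once.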