I have written a basic example of dynamic programming in Java (shown below) that solves the Longest Increasing Subsequence problem (https://en.wikipedia.org/wiki/Longest_increasing_subsequence).
The function works, but for a homework assignment I am trying to find out the time complexity of this algorithm versus its non-dynamic equivalent.
I believe the dynamic version is O(n^2), but I am very confused about the non-dynamic equivalent. I have tried and failed to write a non-dynamic version, but I think it would consist of a recursive call within two nested for loops. Would that imply exponential time complexity, or even factorial time complexity?
I would be very grateful for any help in cracking this complexity conundrum, or even in producing a non-dynamic, recursive equivalent of the function I have written below (my own rough guess at what that would look like is at the very bottom of this post).
Thanks in advance!
public static int longest(int[] array) {
    int n = array.length;

    // results[i] caches the length of the longest increasing subsequence
    // that ends at index i; -1 marks a subproblem not yet solved.
    int[] results = new int[n];
    for(int i = 0; i < n; i++) {
        results[i] = -1;
    }

    // The overall answer is the best length over every possible ending index.
    int max = 1;
    for(int j = 1; j <= n; j++) {
        int current = memoized_longest(array, j, results);
        if(current > max) {
            max = current;
        }
    }
    return max;
}

// Returns the length of the longest increasing subsequence ending at
// index n - 1, memoizing each answer in results[].
public static int memoized_longest(int[] array, int n, int[] results) {
    // Reuse a previously computed result if there is one.
    if(results[n-1] >= 0) {
        return results[n-1];
    }

    // Base case: the subsequence ending at the first element has length 1.
    if(n == 1) {
        results[n-1] = 1;
        return results[n-1];
    }

    // Try extending the best subsequence ending at each earlier index i
    // whose value is smaller than array[n - 1].
    int q = 1;
    for(int i = n - 1; i >= 0; i--) {
        if(array[i] < array[n - 1]) {
            q = Math.max(q, 1 + memoized_longest(array, i+1, results));
        }
    }
    results[n-1] = q;
    return q;
}
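
For reference, here is my rough guess at what the non-memoized version would look like. It is simply the same recursion as memoized_longest with the results[] cache stripped out, so overlapping subproblems get recomputed on every call; the plain_longest and recursive_longest names are just placeholders, and I am not at all sure this is correct, which is partly why I am asking.

// Driver: same as longest(), but calling the uncached recursion.
public static int plain_longest(int[] array) {
    int n = array.length;
    int max = 1;
    for(int j = 1; j <= n; j++) {
        max = Math.max(max, recursive_longest(array, j));
    }
    return max;
}

// Length of the longest increasing subsequence ending at index n - 1,
// recomputed from scratch on every call (no results[] cache).
public static int recursive_longest(int[] array, int n) {
    if(n == 1) {
        return 1;
    }
    int q = 1;
    for(int i = n - 1; i >= 0; i--) {
        if(array[i] < array[n - 1]) {
            q = Math.max(q, 1 + recursive_longest(array, i+1));
        }
    }
    return q;
}

If this is right, then for example plain_longest(new int[]{10, 9, 2, 5, 3, 7, 101, 18}) should return 4, the same answer as longest on that input.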