I'm working on proving a greedy solution to the problem in Codeforces 205B, which states:
Given an array of integers a of length n, determine the minimum number of operations needed to make the array sorted (non-decreasing). An operation is defined as: pick some l, r and increase every a[i] with l <= i <= r by 1. Constraints: 1 <= a[i] <= 1e9.
The pseudo-code of the greedy solution is as follows:
total = 0;
for (int i = 1; i < n; i++) {
    if (a[i - 1] > a[i]) {
        total += (a[i - 1] - a[i]);
    }
}
return total;
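In full, the greedy is just this function (the name min_ops is mine; I use long long for the total, since the drops can sum to roughly n * 1e9 and overflow int):

```cpp
#include <cstddef>
#include <vector>

// Greedy answer: sum the "drops" a[i-1] - a[i] over all positions where the
// array decreases. long long because the total can reach about n * 1e9.
long long min_ops(const std::vector<long long>& a) {
    long long total = 0;
    for (std::size_t i = 1; i < a.size(); ++i) {
        if (a[i - 1] > a[i]) {
            total += a[i - 1] - a[i];
        }
    }
    return total;
}
```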
This greedy approach models the following: whenever we encounter an a[i] that is less than a[i-1], we perform the operation on the range [i, n-1] a total of a[i-1] - a[i] times.
Example: [10, 1, 5, 2] -> [10, 10, 14, 11] -> [10, 10, 14, 14].
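Applying those operations literally reproduces the intermediate arrays above (the helper name apply_op is mine; it collapses k repetitions of the unit operation into a single += k):

```cpp
#include <cstddef>
#include <vector>

// Add k to every element of a[l..r] -- i.e. the unit operation repeated k times.
void apply_op(std::vector<long long>& a, std::size_t l, std::size_t r, long long k) {
    for (std::size_t i = l; i <= r; ++i) a[i] += k;
}
```

For [10, 1, 5, 2], apply_op(a, 1, 3, 9) gives [10, 10, 14, 11], and then apply_op(a, 3, 3, 3) gives [10, 10, 14, 14], for 9 + 3 = 12 unit operations in total, matching the greedy sum.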
I understand why this approach produces a valid answer: it pays exactly what is needed to fix each element that is less than its predecessor, and raising the entire suffix by the same amount leaves the relative differences of later elements unchanged. What I'm having trouble with is formally proving that it is also minimal. How do I show that no other sequence of operations can achieve a sorted array in fewer operations?
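For what it's worth, before attempting the proof I cross-checked the greedy against a brute-force BFS over all operation sequences on tiny arrays (the function name and the cap parameter are mine; the cap only keeps the state space finite and must be generous enough not to cut off an optimal solution). It agrees with the greedy on every small case I tried, e.g. it finds 12 for [10, 1, 5, 2], so I'm fairly confident minimality holds and it's only the proof I'm missing:

```cpp
#include <algorithm>
#include <cstddef>
#include <map>
#include <queue>
#include <vector>

// Brute-force check: BFS over arrays reachable via unit operations.
// The first sorted array popped from the queue is at minimum distance.
// Only feasible for tiny inputs; cap bounds element values so the
// search space stays finite.
int brute_min_ops(const std::vector<int>& start, int cap) {
    std::map<std::vector<int>, int> dist;
    std::queue<std::vector<int>> q;
    dist[start] = 0;
    q.push(start);
    while (!q.empty()) {
        std::vector<int> cur = q.front();
        q.pop();
        int d = dist[cur];
        if (std::is_sorted(cur.begin(), cur.end())) return d;
        // Try every range [l, r] as one unit operation.
        for (std::size_t l = 0; l < cur.size(); ++l) {
            for (std::size_t r = l; r < cur.size(); ++r) {
                std::vector<int> nxt = cur;
                bool within = true;
                for (std::size_t i = l; i <= r; ++i) {
                    if (++nxt[i] > cap) { within = false; break; }
                }
                if (within && !dist.count(nxt)) {
                    dist[nxt] = d + 1;
                    q.push(nxt);
                }
            }
        }
    }
    return -1;  // no sorted array reachable under this cap
}
```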