I'm working on a problem where A is decremented by the value of k in every iteration, and k is doubled in every iteration. For example:
A = 30 --> 29 --> 27 --> 23 --> 15 --> -1 (stops once A <= 0)
k =  1 --> 2 --> 4 --> 8 --> 16
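In code, the loop looks roughly like this (a minimal sketch; the function and variable names are my own):

    def count_iterations(a):
        # Count iterations until a drops to zero or below.
        k = 1
        iterations = 0
        while a > 0:
            a -= k   # subtract the current step
            k *= 2   # the step doubles every iteration
            iterations += 1
        return iterations

    # count_iterations(30) == 5, matching the trace above:
    # 30 --> 29 --> 27 --> 23 --> 15 --> -1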
How can I assess the time complexity of this algorithm? I figure it has to be related to O(log N) because of the doubling of k, but I'm not sure how to arrive at that conclusion intuitively or mathematically.