I'm having trouble finding the time complexity of the following algorithm:
int count = 0;
for (int i = 0; i < n - 1; i++) {
    for (int j = i + 1; j < n; j++) {
        for (int k = n; k > n / 2; k--) {
            count++;
        }
    }
}
int j = 0;
int num = count;
while (j < num) {
    for (int k = 0; k < 10; k++) {
        count++;
    }
    j++;
}
I got O(n) for the worst case. Is this right, or did I mess up?
If someone could explain it, ideally with what operations happen per line, that would be great. The first part especially confuses me, since there are multiple nested for loops.
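As a sanity check on whatever answer comes out, here is a small sketch (in Java; the class and method names are just mine for illustration) that reproduces the first triple loop and prints the final count for a few values of n, so the growth rate can be compared against O(n), O(n^2), and so on:

```java
public class CountCheck {
    // Same three nested loops as the first part of the snippet:
    // i runs 0..n-2, j runs i+1..n-1, k runs n down to n/2+1.
    static long iterations(int n) {
        long count = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = i + 1; j < n; j++) {
                for (int k = n; k > n / 2; k--) {
                    count++;
                }
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // Doubling n and watching how count grows hints at the exponent:
        // roughly 8x per doubling would suggest cubic growth.
        for (int n : new int[] {10, 20, 40}) {
            System.out.println("n=" + n + "  count=" + iterations(n));
        }
    }
}
```

For example, the (i, j) pairs number n(n-1)/2 and the k loop runs about n/2 times per pair, so for n=10 this prints count=225 (45 pairs times 5 inner steps). Comparing counts at n=10, 20, 40 shows how fast the first part really grows, and since the second part loops num = count times, it inherits that same order.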