What's a good strategy for determining the running time (in Big O notation) of operations on data structures and algorithms? I have the following to figure out the running times for, and I'm having trouble determining what they would be.
AINC is an array containing n integers arranged in increasing order.
AD is an array containing n integers arranged in decreasing order.
AR is an array containing n integers in random order.
Q is a queue implemented as a linked list and containing p elements.
LINK is a linked list containing n nodes.
CIRC is a circular linked list containing n elements, where the external pointer C points to the last element.
T is a binary search tree containing n nodes.
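To make the question concrete, here is a minimal Python sketch of how I picture the linked structures (the class names Node and TreeNode are my own, not part of the exercise):

```python
class Node:
    """A singly linked list node, as used for LINK, Q, and CIRC."""
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt


class TreeNode:
    """A binary search tree node, as used for T."""
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right
```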
a) Searching for an element in AINC using linear search.
b) Deleting the 10th node of the linked list LINK.
c) Calling a function that uses Q and calls dequeue m times.
d) Inserting an element at the end of the list CIRC (for d and e, see the first sketch after this list).
e) Deleting the last element of CIRC.
f) Finding the largest element of T (for f and g, see the second sketch after this list).
g) Determining the height of T.
h) Making the call selectionsort(AINC, n).
i) Making two calls, one after another: first mergesort(AD, n), then insertionsort(AD, n).
j) Converting a decimal integer num into its binary equivalent (the conversion I mean is the last sketch below).
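For d) and e), this is roughly what I picture happening on CIRC, assuming C is an external reference to the last node and using the Node class from the sketch above (the function names are mine):

```python
def circ_insert_last(C, value):
    """Insert at the end of CIRC; C points to the last node.
    Only a couple of pointers near C are rewired, no traversal."""
    node = Node(value)
    if C is None:            # empty list: the node points to itself
        node.next = node
        return node
    node.next = C.next       # new node's successor is the old head
    C.next = node            # old last node now links to the new node
    return node              # the new node is the new last element


def circ_delete_last(C):
    """Delete the last node of CIRC. To relink the list we need the
    predecessor of C, which means walking all the way around the circle."""
    if C is None or C.next is C:    # empty or single-element list
        return None
    prev = C.next                   # start at the head
    while prev.next is not C:       # walk until the node just before C
        prev = prev.next
    prev.next = C.next              # unlink the old last node
    return prev                     # the predecessor is the new last node
```

If that's the right picture, insertion never traverses anything, while deletion has to walk past every node to find the predecessor, which I suspect is what matters for the running times.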
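For f) and g), these are the tree operations I have in mind, using TreeNode from above and taking the height of an empty tree to be -1 (my convention, not the exercise's):

```python
def largest(t):
    """Largest key in a nonempty BST: keep following right children."""
    while t.right is not None:
        t = t.right
    return t.key


def height(t):
    """Height of a binary tree: one more than the taller subtree.
    Every node gets visited once."""
    if t is None:
        return -1
    return 1 + max(height(t.left), height(t.right))
```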
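And for j), the conversion I mean is the usual repeated division by 2 (again, just a sketch):

```python
def to_binary(num):
    """Convert a non-negative decimal integer into its binary string
    by repeatedly dividing by 2 and collecting the remainders."""
    if num == 0:
        return "0"
    bits = []
    while num > 0:
        bits.append(str(num % 2))   # the remainder is the next bit
        num //= 2                   # halve the number each iteration
    return "".join(reversed(bits))  # remainders arrive least-significant first
```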
***This is not homework; I am preparing for an exam.