This is a question from my homework:
Implement a FIFO queue using two stacks.
The total running time of the Enqueue and Dequeue functions should be O(n) in the worst case. Also, analyze the running time of the algorithm.
What I did:
void Enqueue(T *value)
{
    s1.Push(value);
}

T *Dequeue()
{
    if (s2.size > 0)
        return s2.Pop();

    if (s1.size > 0)
    {
        // Move *every* element. A counting loop (for i < s1.size) would stop
        // halfway, because s1.size shrinks with each Pop.
        while (s1.size > 0)
            s2.Push(s1.Pop());
        return s2.Pop();
    }

    return NULL; // both stacks are empty
}
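As a quick sanity check of the FIFO order, here is a minimal self-contained sketch of the same two-stack idea; it swaps in std::stack for the stack class assumed above, so the signatures differ slightly:

#include <cstdio>
#include <stack>

std::stack<int> s1, s2;

void Enqueue(int value)
{
    s1.push(value);
}

bool Dequeue(int &out)
{
    // Refill the out-stack only when it is empty; this reverses s1,
    // so the oldest element ends up on top of s2.
    if (s2.empty())
        while (!s1.empty()) {
            s2.push(s1.top());
            s1.pop();
        }

    if (s2.empty())
        return false; // queue is empty

    out = s2.top();
    s2.pop();
    return true;
}

int main()
{
    for (int i = 1; i <= 3; i++)
        Enqueue(i);

    int x;
    while (Dequeue(x))
        printf("%d ", x); // prints 1 2 3 (FIFO order)
    printf("\n");
    return 0;
}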
Analysis of the algorithm:
The running time of a single Enqueue is Theta(1), so the total running time of all n Enqueue operations is n * Theta(1) = Theta(n).
The worst-case running time of a single Dequeue is Theta(n): this happens when s2 is empty and all n inserted items are still sitting in s1, so every one of them must be transferred first. Every other call in this scenario finds s2 non-empty and runs in Theta(1). More generally, each item is moved from s1 to s2 at most once, so the transfer work summed over all Dequeue calls is O(n) in total.
So the total running time is O(n) (the n pushes onto s1) + O(n) (all transfers between the stacks) + n * O(1) (the n pops from s2) = O(n), i.e. O(1) amortized per operation.
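Spelling this out with the aggregate method (counting individual stack operations, assuming n Enqueue calls and n Dequeue calls in total):

$$T(n) \;\le\; \underbrace{n}_{\text{Push onto } s_1} \;+\; \underbrace{n}_{\text{Pop from } s_1} \;+\; \underbrace{n}_{\text{Push onto } s_2} \;+\; \underbrace{n}_{\text{Pop from } s_2} \;=\; 4n \;=\; \Theta(n),$$

since each element is pushed at most twice and popped at most twice, once per stack.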
Is this correct?
