On Wikipedia, under the heading "Running time", the complexity of Dijkstra's algorithm (with an array as the priority queue) is given as

O(|V|^2 + |E| * decreaseKey) = O(|V|^2)
This is my first time analyzing the time complexity of an algorithm, but in my opinion it should be:
- The outer loop (while Q is not empty) runs O(|V|) times
- Find-min on an array takes O(|V|)
- Once the minimum is found, examining all its neighbors can take O(|E|)
- Updating the keys for these neighbors is done in constant time C
so we will have

O(|V|) * (O(|V|) + O(|E|) * C) = O(|V|^2 + |V| * |E| * C)
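For reference, here is a minimal sketch of the array-based version being analyzed, assuming the graph is an adjacency-list dict of the form {vertex: [(neighbor, weight), ...]} (the names and structure are illustrative, not the exact Wikipedia pseudocode):

```python
import math

def dijkstra(graph, source):
    # graph: {vertex: [(neighbor, weight), ...]} -- hypothetical adjacency-list form
    dist = {v: math.inf for v in graph}   # tentative distances (the "keys")
    dist[source] = 0
    unvisited = set(graph)                # Q

    while unvisited:                      # outer loop: |V| iterations
        # find-min by scanning the whole array: O(|V|) per iteration
        u = min(unvisited, key=lambda v: dist[v])
        unvisited.remove(u)

        # examine only u's own neighbors: O(deg(u)) in this iteration
        for v, w in graph[u]:
            if dist[u] + w < dist[v]:     # decrease-key on an array: O(1)
                dist[v] = dist[u] + w
    return dist
```

Note that the inner loop only walks the extracted vertex's own adjacency list, not all of E.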
My question is about the term

O(|V| * |E| * C)

On Wikipedia that extra |V| factor doesn't appear at all. Why? Is my analysis wrong?
|V| * |E| is inaccurate. Each edge is only visited at most twice (once when each of its endpoints is visited), so 2 * |E| is more accurate, which is just O(|E|). |E| in a simple graph is at most the number of edges in the complete graph on |V| vertices, which is (|V|^2 - |V|)/2 = O(|V|^2).
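To make the counting concrete, here is a quick check on a small hypothetical undirected graph that the neighbor examinations summed over all iterations equal 2 * |E| (the handshake lemma), which is why the edge term is O(|E|) and not O(|V| * |E|):

```python
# Hypothetical 4-vertex undirected graph, stored as adjacency lists
graph = {
    'a': [('b', 1), ('c', 4)],
    'b': [('a', 1), ('c', 2), ('d', 5)],
    'c': [('a', 4), ('b', 2), ('d', 1)],
    'd': [('b', 5), ('c', 1)],
}
# The 5 edges of the graph, each stored once
edges = {frozenset(e) for e in [('a','b'), ('a','c'), ('b','c'), ('b','d'), ('c','d')]}

# Total neighbor examinations over a full run = sum of degrees
examinations = sum(len(adj) for adj in graph.values())  # 2+3+3+2 = 10
assert examinations == 2 * len(edges)                   # each edge seen twice
```

So the array-based total is O(|V|^2 + |E|) = O(|V|^2), matching Wikipedia.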