Correct, it is in O(n^2). Even more than that, it is in Theta(n^2).
Explanation
The only thing that matters is how often the // operation is executed. The bookkeeping for the loop itself, such as updating the indices, is in Theta(1), so we can ignore it.
Therefore, count how often the // operation is executed. That is n * (3i + n - 2), like you said. However, i changes with every iteration of the outer loop, so you cannot just multiply by n; you have to write the sum out in full:

f(n) = sum_{i=1}^{n} (3i + n - 2)

Simplify this:

f(n) = 3 * n(n + 1)/2 + n(n - 2) = (5/2)n^2 - (1/2)n
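As a quick sanity check, you can compare the written-out sum against its closed form with a short Python script. This assumes the per-iteration count 3i + n - 2 from above, with i running from 1 to n; the function names are just for illustration:

```python
# Sanity check: compare the written-out sum with its closed form.
# Assumption: the // operation runs 3i + n - 2 times in outer
# iteration i, for i = 1, ..., n.

def f_sum(n):
    """Count // executions by summing over the outer iterations."""
    return sum(3 * i + n - 2 for i in range(1, n + 1))

def f_closed(n):
    """Closed form (5/2) n^2 - (1/2) n, kept in exact integer arithmetic."""
    return (5 * n * n - n) // 2

# Both agree for every tested n.
for n in range(1, 100):
    assert f_sum(n) == f_closed(n)
```

This does not replace the algebra, but it catches slips in the simplification immediately.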
Which clearly is in Theta(n^2).
Formal definition proof
You can easily show that with the formal definition. To that end, call the counting function f, so f(n) = (5/2)n^2 - (1/2)n. We have

f(n) = (5/2)n^2 - (1/2)n <= 3n^2 for all n >= 1,

so f is in O(n^2). And we have

f(n) = (5/2)n^2 - (1/2)n >= 2n^2 for all n >= 1,

yielding f in Omega(n^2). Together, this means f is in Theta(n^2).
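The two inequalities can also be spot-checked mechanically. The script below uses the closed-form count f(n) = (5n^2 - n)/2 and one (loose) valid choice of constants, 2 and 3; this is a spot check over a finite range, not a proof:

```python
# Spot check of the Theta(n^2) bounds: 2 n^2 <= f(n) <= 3 n^2 for n >= 1.
# f is the closed-form operation count (5 n^2 - n) / 2; the constants
# 2 and 3 are one loose valid choice, not the tightest possible.

def f(n):
    return (5 * n * n - n) / 2

for n in range(1, 10_000):
    assert 2 * n * n <= f(n) <= 3 * n * n
```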
I chose the constants loosely. You could make the bounds tighter, but you only need to find one choice of constants for which the inequalities hold, so it does not matter.
Limit definition proof
Of course, we can also easily show it using the limit definitions:

lim_{n -> inf} f(n) / n^2 = lim_{n -> inf} ((5/2)n^2 - (1/2)n) / n^2 = lim_{n -> inf} (5n - 1/2) / (2n) = lim_{n -> inf} 5/2 = 5/2

The limit is > 0 and < inf, so f is in Theta(n^2) (see Wikipedia: Big O notation).
The table for the results is:

limit < inf            : f in O(g)
limit > 0              : f in Omega(g)
limit = 0              : f in o(g)
limit = inf            : f in omega(g)
limit > 0 and < inf    : f in Theta(g)
For the limits, we used L'Hôpital's rule (see Wikipedia), which informally says:

The limit of f/g is the same as the limit of f'/g', provided the latter limit exists.
Note
The analysis assumed that the // operation itself runs in Theta(1). Otherwise, you obviously have to account for its complexity as well.