
If I have an array declared like this:

int a[3][2];

then why is:

sizeof(a+0) == 8

whereas:

sizeof(a)   == 24

I don't understand how adding 0 to the pointer changes the sizeof output. Is there maybe some implicit type cast?

2 Answers


If you add 0 to a, then a is first converted to a pointer value of type int(*)[2] (pointing to the first element of the array of type int[3][2]). Then 0 is added to that pointer, which advances it by 0 * sizeof(int[2]) bytes, i.e. leaves it unchanged. The result is still a pointer, so sizeof(a+0) yields the size of a pointer, which is 8 bytes on your box.
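
To see the decay concretely, here is a minimal sketch; the printed values assume a 64-bit platform with 8-byte pointers, which is common but not guaranteed by the standard:

#include <stdio.h>

int main(void) {
    int a[3][2];

    /* a decays to int(*)[2]; adding 0 keeps that pointer type */
    int (*p)[2] = a + 0;

    printf("sizeof(a+0) = %zu\n", sizeof(a + 0)); /* typically 8 */
    printf("sizeof(p)   = %zu\n", sizeof p);      /* same: pointer size */
    return 0;
}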

If you do sizeof(a), there is no reason for the compiler to convert a to a pointer value (that conversion only makes sense if you want to index elements or do pointer arithmetic involving the addresses of elements). So the expression a keeps its array type, and you get the size of int[3][2] instead of the size of int(*)[2]: 3 * 2 * sizeof(int), which on your box is 24 bytes.
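
And a companion sketch for the non-decayed case; the 24 assumes a 4-byte int, which is typical but implementation-defined:

#include <stdio.h>

int main(void) {
    int a[3][2];

    /* sizeof applied to the array itself: no decay happens */
    printf("sizeof(a)    = %zu\n", sizeof a);                      /* 3*2*sizeof(int), e.g. 24 */
    printf("rows         = %zu\n", sizeof a / sizeof a[0]);        /* 3 */
    printf("cols per row = %zu\n", sizeof a[0] / sizeof a[0][0]);  /* 2 */
    return 0;
}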

Hope this clarifies things.


3 Comments

Calling sizeof(a+0) is the same as doing sizeof(a[0]), which is taking the size of the first int[2] element in the array, so sizeof(int)*2 = 8. It is not taking the size of a pointer.
@Remy I would appreciate it if you would formulate that theory and post it as a paper
@RemyLebeau-TeamB: a[0] is equivalent to *(a+0), not (a+0).
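
The disagreement above is easy to test. The sketch below shows why sizeof(a[0]) and sizeof(a+0) can both print 8 yet measure different things; the coincidence assumes 4-byte int and 8-byte pointers:

#include <stdio.h>

int main(void) {
    int a[3][2];

    /* a[0] is *(a+0): an int[2], so 2 * sizeof(int) */
    printf("sizeof(a[0]) = %zu\n", sizeof a[0]);   /* 8 with 4-byte int */

    /* a+0 is a pointer value of type int(*)[2] */
    printf("sizeof(a+0)  = %zu\n", sizeof(a + 0)); /* 8 with 8-byte pointers */

    /* On a 32-bit platform the second line would print 4
       while the first would still print 8 */
    return 0;
}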

sizeof reports the size of its operand's type. When you add 0 to a, the array decays to a pointer of type int(*)[2], and a pointer is 8 bytes on typical 64-bit systems.
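
If you want the compiler itself to confirm this, a C11 compile-time check works as a sketch (it asserts only the type identity, not any particular byte size):

int a[3][2];

/* C11: passes because a+0 has pointer type int(*)[2], not array type */
_Static_assert(sizeof(a + 0) == sizeof(int (*)[2]), "a+0 decays to int(*)[2]");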

