If I have an array declared like this:
int a[3][2];
then why is:
sizeof(a+0) == 8
whereas:
sizeof(a) == 24
I don't understand how adding 0 to the pointer changes the sizeof output. Is there maybe some implicit type conversion?
If you add 0 to a, then a is first converted to a pointer value of type int(*)[2] (pointing to the first element of an array of type int[3][2]). Then 0 is added to that pointer, which adds 0 * sizeof(int[2]) bytes to the address it represents. Since that multiplication yields 0, the result is the same pointer value. Because the result is a pointer, sizeof(a+0) yields the size of a pointer, which is 8 bytes on your box.
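A minimal sketch of that decay and arithmetic, assuming a C99 compiler, a 4-byte int, and 8-byte pointers (the variable p is illustrative, and the printed numbers vary by platform):

    #include <stdio.h>

    int main(void)
    {
        int a[3][2];
        int (*p)[2] = a + 0;  /* array-to-pointer decay: a + 0 has type int(*)[2] */

        /* pointer arithmetic on int(*)[2] moves in steps of sizeof(int[2]) bytes */
        printf("%td\n", (char *)(a + 1) - (char *)(a + 0)); /* sizeof(int[2]), e.g. 8 */
        printf("%zu\n", sizeof p);                          /* size of a pointer, e.g. 8 */
        return 0;
    }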
If you do sizeof(a), there is no reason for the compiler to convert a to a pointer value (that conversion only makes sense if you want to index elements or do pointer arithmetic involving the address of the elements). So the expression a remains of array type, and you get the size of int[3][2] instead of the size of int(*)[2]: 3 * 2 * sizeof(int), which on your box is 24 bytes.
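Putting the two cases side by side makes the difference visible; the values 24 and 8 below assume the same 4-byte int and 8-byte pointers:

    #include <stdio.h>

    int main(void)
    {
        int a[3][2];

        printf("sizeof(a)   = %zu\n", sizeof(a));     /* int[3][2]: 3 * 2 * sizeof(int), e.g. 24 */
        printf("sizeof(a+0) = %zu\n", sizeof(a + 0)); /* int(*)[2], a pointer: e.g. 8 */
        return 0;
    }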
Hope this clarifies things.
sizeof(a+0) is not the same as doing sizeof(a[0]). a[0] is equivalent to *(a+0), not (a+0), so sizeof(a[0]) takes the size of the first int[2] element of the array, i.e. sizeof(int) * 2 = 8, while sizeof(a+0) takes the size of a pointer. On this platform both happen to be 8 bytes, which makes it easy to mistake one for the other.
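A short check of that equivalence, again assuming a 4-byte int and 8-byte pointers; the first two lines must print the same number, and the third matches only by coincidence:

    #include <stdio.h>

    int main(void)
    {
        int a[3][2];

        printf("%zu\n", sizeof(a[0]));     /* int[2]: 2 * sizeof(int), e.g. 8 */
        printf("%zu\n", sizeof(*(a + 0))); /* identical to sizeof(a[0]) */
        printf("%zu\n", sizeof(a + 0));    /* a pointer: 8 here, but only by coincidence */
        return 0;
    }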