For the code to compile, you have to change the last statement to display(&a[0][0]). The reason is that array-to-pointer decay is not applied recursively: int[3][4] decays to int (*)[4], not int **. Once the code compiles, you'll see the problem you asked about: incrementing by 1 gives different results for a and p.
Although both a and p hold the same base address of the array (call it X), the two pointers have different types: p is of type int *, while a decays to int (*)[4], a pointer to an array of 4 int, because array decay applies only to the first dimension. In pseudocode:
p + 1 = X + sizeof(int)
a + 1 = X + sizeof(int[4]) = X + (sizeof(int) * 4)
where 4 is the second dimension of the array. Hence, on a machine where sizeof(int) is 4, you'd see X + 4 for p + 1 and X + 16 for a + 1. On my machine, p + 1 = 0x22fe34, while a + 1 = 0x22fe40.
You should get a difference of sizeof(int) (probably 4) from display(), and a difference of 4 * sizeof(int) (probably 16) from the code in main(). This is because a in a + 1 is of type int (*)[4], that is, pointer to array of 4 int.