I was just checking the size of some datatypes in Python 3 and I observed this.
import sys
val = None
print(sys.getsizeof(val))
The output was 16 as expected.
I then made a list of 1000 `None` values and expected the size to be at least 16 * 1000 = 16000. But the result I got was different.
import sys
val = [None]*1000
print(sys.getsizeof(val))
The output was 8064, nearly half the size I expected.
What is the reason for this? Why is less memory allocated?
`sys.getsizeof` returns only the size of the container itself (the list header plus its array of `PyObject` pointers, 8 bytes per pointer on a 64-bit system), not the objects the list refers to. In this case, since `None` is a singleton, every slot points at the same object, so the total size is simply `sys.getsizeof(val) + sys.getsizeof(None)`. In general, if a list references all unique objects, you need `sys.getsizeof(my_list) + sum(map(sys.getsizeof, my_list))`. Often, the true total lies somewhere between these two extremes.
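To illustrate the two extremes, here is a small sketch (the `object()` list is just a hypothetical example of all-unique elements):

```python
import sys

val = [None] * 1000

# Size of the list object alone: header + 1000 pointers (8 bytes each on 64-bit).
container_size = sys.getsizeof(val)

# None is a singleton, so every slot points at the same object;
# the real total is just the container plus one None.
total_with_singleton = container_size + sys.getsizeof(None)

# Upper bound: a list of 1000 distinct objects, counting each element once.
unique = [object() for _ in range(1000)]
total_unique = sys.getsizeof(unique) + sum(map(sys.getsizeof, unique))

print(container_size, total_with_singleton, total_unique)
```

On a 64-bit CPython build, `container_size` is roughly the list header plus 8000 bytes of pointers, while `total_unique` is several times larger because each element is counted separately.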