I need to parse a large file (~500 MB) and load only part of it into a list; I don't need the entire file in memory.
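For illustration, a minimal sketch of the kind of partial load I mean (the file name, the row limit, and the tab-split parsing are placeholders, not the real code):

```python
from itertools import islice

def load_partial(path, max_rows=1_000_000):
    """Read only the first max_rows lines of the file into a list."""
    rows = []
    with open(path) as f:
        for line in islice(f, max_rows):
            # hypothetical parsing: split each line into string fields
            rows.append(line.rstrip("\n").split("\t"))
    return rows

rows = load_partial("data.txt")  # "data.txt" stands in for the real ~500 MB file
```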
I had a feeling that Python allocates much more memory for the list than the size of the data it contains.
I tried to use pympler's asizeof to estimate the overhead, but it fails with MemoryError, which is strange to me: I thought that if the list is already in memory, asizeof should just walk over it, sum the sizes of all the objects, and that's it.
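This is roughly the measurement that fails (rows is the list from the sketch above; asizeof.asizeof is the standard pympler call for a deep size):

```python
from pympler import asizeof

# Deep size of the list and everything it references;
# on the full load this call itself dies with MemoryError.
print(asizeof.asizeof(rows))
```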
Then I took a chunk of the original file, and I was shocked by the size asizeof reported for the list: it was three times bigger than the file size.
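Roughly how I compared the two (chunk.txt is a hypothetical smaller file cut from the original data, parsed the same way as in the sketch above):

```python
import os
from pympler import asizeof

chunk_path = "chunk.txt"  # hypothetical smaller test file

with open(chunk_path) as f:
    chunk_rows = [line.rstrip("\n").split("\t") for line in f]

list_bytes = asizeof.asizeof(chunk_rows)   # deep size: outer list + row lists + strings
file_bytes = os.path.getsize(chunk_path)
print(list_bytes, file_bytes, list_bytes / file_bytes)  # the ratio came out around 3
```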
So the questions are: is the size reported by asizeof correct? What is a more memory-efficient way to use a list in Python? And how can I check the size of a bigger list when asizeof fails with MemoryError?