I am working on a project where I read a list of as many as 250,000 items (or more) and convert each of its entries into a key of a hash table.
sample_key = open("sample_file.txt").readlines()
sample_counter = [0] * (len(sample_key))
sample_hash = {sample.replace('\n', ''):counter for sample, counter in zip(sample_key, sample_counter)}
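For reference, the three lines above can be collapsed into a single streaming comprehension that never materializes the intermediate list or the counter list (a sketch; the tiny "sample_file.txt" written here is just a stand-in for the real input file):

```python
# Tiny illustrative input, standing in for the real sample_file.txt.
with open("sample_file.txt", "w") as f:
    f.write("alpha\nbeta\ngamma\n")

# Single-pass equivalent of the snippet above: iterate over the file
# line by line, strip each trailing newline, and map every line to 0.
with open("sample_file.txt") as f:
    sample_hash = {line.rstrip("\n"): 0 for line in f}
```

The `with` block also guarantees the file handle is closed, which the bare `open(...).readlines()` call does not.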
This code works well when len(sample_key) is in the 1,000-2,000 range. Beyond that, it simply stops processing any further data.
Any suggestions on how I can handle this large list of data?
PS: Also, if there is a more optimal way to perform this task (such as reading each line directly as a hash key entry), please suggest it. I'm new to Python.
collections.Counter
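The bare mention above can be fleshed out: collections.Counter builds the key-to-count mapping in one pass over the file, and if you only need every key initialized to zero (as in the question's code), dict.fromkeys does that directly. A sketch, with a tiny generated file standing in for the real "sample_file.txt":

```python
from collections import Counter

# Tiny illustrative input, standing in for the real sample_file.txt.
with open("sample_file.txt", "w") as f:
    f.write("apple\nbanana\napple\n")

# Counter maps each stripped line to the number of times it occurs;
# streaming the file keeps only one line in memory at a time.
with open("sample_file.txt") as f:
    sample_hash = Counter(line.rstrip("\n") for line in f)

print(sample_hash["apple"])   # → 2

# If instead every key should start at zero, as in the question:
with open("sample_file.txt") as f:
    zero_hash = dict.fromkeys((line.rstrip("\n") for line in f), 0)
```

Either approach scales to hundreds of thousands of lines, since neither builds an intermediate list of all lines the way readlines() does.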