I have a data file about 6000 lines long which I'm going to load into a buffer, parse, and write out to another JSON file. What is the better way to accomplish this task? Should I load the whole file into a buffer, parse it, and then write the result to the output file? Or should I load a chunk of the file into a buffer, process it, and write it out, with the read/process/write tasks overlapping? Is this similar to async functions in JavaScript? Are there examples in Python of simple file loading and writing to a file?
1 Answer
You can use aiofiles:
import asyncio
import aiofiles

async def main():
    async with aiofiles.open('filename', mode='r') as f:
        async for line in f:
            print(line)

asyncio.run(main())
They have good usage documentation in their GitHub repo.
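To answer the simpler part of the question: at 6000 lines, the file easily fits in memory, so there is no need for chunking or async at all; plain blocking I/O is fine. Here is a minimal sketch of load-parse-write, assuming a made-up line format of `key=value` (substitute your real parsing logic):

```python
import json

def parse_and_write(src_path, dest_path):
    """Read a text file of 'key=value' lines and write it out as JSON.

    Hypothetical format for illustration; replace the parsing with
    whatever your actual data file needs.
    """
    records = {}
    with open(src_path) as src:
        for line in src:          # iterating the handle streams line by line
            line = line.strip()
            if not line:
                continue          # skip blank lines
            key, _, value = line.partition('=')
            records[key] = value
    with open(dest_path, 'w') as dest:
        json.dump(records, dest, indent=2)
```

Chunked, overlapped processing (or aiofiles) only starts to pay off when the file is too large for memory or when you have many files and other work to interleave with the I/O.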