
I have a data file about 6,000 lines long that I'm going to load into a buffer, parse, and write out to another JSON file. What is the better way to accomplish this task? Should I load the whole file into a buffer, parse it, and then write it to the output file? Or should I load a chunk of the file into a buffer, process it, and write it out, keeping both tasks running simultaneously? Is this similar to async functions in JavaScript? Are there examples in Python of simple file loading and writing to a file?

1 Answer


You can use aiofiles:

import asyncio
import aiofiles

async def main():
    async with aiofiles.open('filename', mode='r') as f:
        async for line in f:
            print(line)

asyncio.run(main())

They have good usage documentation in their GitHub repo.
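For the simple synchronous case the question also asks about, a minimal sketch using only the standard library may be enough: a 6,000-line file is small, so reading it all at once, parsing, and writing the JSON afterwards is usually fine. The filenames, the assumed `key,value` line format, and the record layout below are illustrative assumptions, not part of the original question:

```python
import json

def convert(src, dest):
    # Read the whole file into memory; 6,000 lines is small
    # enough that chunked/streaming processing is not required.
    records = []
    with open(src, 'r') as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            # Assumed input format: comma-separated "key,value" pairs.
            key, value = line.split(',', 1)
            records.append({'key': key, 'value': value})

    # Write the parsed records out as a JSON array.
    with open(dest, 'w') as f:
        json.dump(records, f, indent=2)
```

Overlapping reads and writes (as with aiofiles above) mainly pays off when the file is large or the I/O is slow; for a file this size the straightforward read-parse-write sequence is typically simpler and just as fast.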
