
I have a JSON file that is too big, and I am reading that file to perform operations in different functions by calling my file-loading function, read_file().

However, this approach performs very poorly. I tried some other ways, but they don't work well. Can anyone suggest a better way? Please refer to the sample code below:

import json

def read_file():  # data-loading function
    with open('dump.json', 'r') as file:
        data = json.load(file)
    return data

def func1():
    data = read_file()
    for data_var in data['some_key']:
        ...  # some operations

def func2():
    data = read_file()
    for data_var in data['some_key']:
        ...  # some operations

def func3():
    data = read_file()
    for data_var in data['some_key']:
        ...  # some operations

def func4():
    data = read_file()
    for data_var in data['some_key']:
        ...  # some operations

def main():
    func1()
    func2()
    func3()
    func4()

main() # main call
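
For reference, one way to avoid re-parsing the file on every call is to cache the result of read_file(), so every function can keep calling it but the JSON is only parsed once. A minimal sketch using functools.lru_cache; the file name and function name are taken from the sample above:

import functools
import json

@functools.lru_cache(maxsize=1)
def read_file():
    # Parsed once on the first call; later calls return the cached dict.
    with open('dump.json', 'r') as file:
        return json.load(file)

With this, the action functions stay unchanged and only the first call pays the parsing cost.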
  • maybe like this: stackoverflow.com/a/32661888/10197418 Commented Jun 6, 2020 at 11:10
  • My concern is a little different, as I have a separate function for loading the JSON data (read_file()), and that function has to return the complete data for reading. The only concern is that in each action function, e.g. func1, func2, func3, etc., I have to write data = read_file() to load the data for reading. So it's line repetition and very slow as well; I'm wondering if there are better ways to avoid the repetition and also yield the data to make it faster. Commented Jun 6, 2020 at 11:48
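
Following the streaming idea in the linked answer, a minimal sketch that yields items one at a time instead of loading the whole file, assuming the third-party ijson package (pip install ijson); the file name and key follow the sample code above:

import ijson

def iter_items():
    with open('dump.json', 'rb') as file:
        # The prefix 'some_key.item' yields each element of the
        # array stored under the top-level key 'some_key'.
        yield from ijson.items(file, 'some_key.item')

def func1():
    for data_var in iter_items():
        ...  # some operations

This keeps memory usage flat even for very large files, at the cost of re-reading the file in each function that iterates over it.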
