I have a JSON file that is too big, and I am reading it in several different functions by calling my file-loading function, e.g. read_file().
However, done this way the performance is very poor. I tried some other approaches but they don't work well. Can anyone suggest a better way? Please refer to the sample code below:
import json

def read_file():  # data loading function
    with open('dump.json', 'r') as file:
        data = json.load(file)
    return data
def func1():
    data = read_file()
    for data_var in data['some_key']:
        ...  # some operations

def func2():
    data = read_file()
    for data_var in data['some_key']:
        ...  # some operations

def func3():
    data = read_file()
    for data_var in data['some_key']:
        ...  # some operations

def func4():
    data = read_file()
    for data_var in data['some_key']:
        ...  # some operations
def main():
    func1()
    func2()
    func3()
    func4()

main()  # main call
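For reference, here is a minimal sketch of one common fix: parse the file once and reuse the result, e.g. by caching read_file with functools.lru_cache. The temp file, sample data, and call counter below are illustrative stand-ins for the real dump.json, not part of the original code:

```python
import json
import tempfile
from functools import lru_cache

# Illustrative stand-in for the real dump.json so the sketch runs on its own.
_path = tempfile.NamedTemporaryFile(suffix=".json", delete=False).name
with open(_path, "w") as f:
    json.dump({"some_key": [1, 2, 3]}, f)

call_count = 0  # tracks how often the file is actually parsed

@lru_cache(maxsize=1)
def read_file():
    # Parsed on the first call only; later calls return the cached dict.
    global call_count
    call_count += 1
    with open(_path, "r") as file:
        return json.load(file)

def func1():
    return sum(read_file()["some_key"])

def func2():
    return len(read_file()["some_key"])

func1()
func2()
func1()
# Despite three read_file() calls, the JSON was parsed only once.
```

One caveat: lru_cache hands every caller the same dict object, so if one function mutates the data the others will see the change. An alternative with the same effect is to call read_file() once in main() and pass the data to func1..func4 as an argument.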