This is an interesting problem. It sounds like you have multiple JSON documents simply concatenated together in one file and want to split them. Since Python's json library only parses a single complete JSON document at a time and rejects anything with trailing data, it is not sufficient on its own.
Fortunately, Python has really great error handling that you can take advantage of here.
import json

def json_split(text):
    '''Split concatenated JSON text and yield each parsed JSON object.'''
    while text:
        try:
            # If the remaining text is a single JSON document, we're done.
            yield json.loads(text)
            return
        except json.JSONDecodeError as error:
            # error.pos is where the "extra data" (the next document) starts,
            # so parse everything before it and keep going with the rest.
            yield json.loads(text[:error.pos])
            text = text[error.pos:].strip()
Then:
with open('some_file.json') as file:
    text = file.read()

for json_object in json_split(text):
    print('json_object = ' + repr(json_object))
The function json_split works by trying to parse the whole text as a single JSON object, which fails if it contains more than one. Fortunately, the JSONDecodeError that results tells us exactly which position the next JSON object starts at (pos), so we split the string there, parse the first piece, and rinse and repeat on the remainder.
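For instance, here is what the error object carries on a small concatenated input:

import json

text = '{"a": 1}{"b": 2}'
try:
    json.loads(text)
except json.JSONDecodeError as error:
    print(error)        # Extra data: line 1 column 9 (char 8)
    print(error.pos)    # 8 -- the index where the second object begins
    print(json.loads(text[:error.pos]))  # {'a': 1}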
There are more efficient ways to do this (e.g. using virtual files or views from the io module to cut down on the copying that the repeated slicing does), but the above will work well for files of modest size.
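One standard-library alternative along those lines (not io views, but it avoids re-slicing the string entirely) is json.JSONDecoder.raw_decode, which parses one document starting at a given index and reports where it stopped. A rough sketch, where the name json_split_raw is just illustrative:

import json

def json_split_raw(text):
    '''Yield each JSON object from concatenated JSON text without copying it.'''
    decoder = json.JSONDecoder()
    pos = 0
    end = len(text)
    while pos < end:
        # raw_decode does not tolerate leading whitespace, so skip it first.
        while pos < end and text[pos].isspace():
            pos += 1
        if pos >= end:
            break
        # Parse one document and move pos to the character just after it.
        obj, pos = decoder.raw_decode(text, pos)
        yield obj

It is a drop-in replacement for json_split in the loop above.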
Edit: Also, to write each object back out to its own file, just use json.dump; example below.
for v in json_split(text):
    # Name each output file after the object's 'tstp' field.
    with open(str(v['tstp']) + '.json', 'w') as file:
        json.dump(v, file)