I have seen similar questions but am not sure what I need to change in my case to produce valid JSON. I have a list of dictionaries like this:

dicts = [{'date': '2018-12-11',
  'target_currency': 'DKK',
  'exchange_rate': 7.4641,
  'base_currency': 'EUR'},
 {'date': '2018-12-11',
  'target_currency': 'GBP',
  'exchange_rate': 0.90228,
  'base_currency': 'EUR'},
...]

which I have written to a .json file like this:

with open('cleaned.json', 'w') as f:
    for d in dict2:
        json.dump(d, f)
        f.write(',')

The resulting file looks like this, and https://jsonlint.com/ reports the error below:

{
    "date": "2018-12-11",
    "target_currency": "USD",
    "exchange_rate": 1.1379,
    "base_currency": "EUR"
}, {
    "date": "2018-12-11",
    "target_currency": "JPY",
    "exchange_rate": 128.75,
    "base_currency": "EUR"
}, {
    "date": "2018-12-11",
    "target_currency": "BGN",
    "exchange_rate": 1.9558,
    "base_currency": "EUR"



Error: Parse error on line 6:
...e_currency": "EUR"}, {   "date": "2018-1
----------------------^
Expecting 'EOF', got ','

When I replaced the comma with a newline separator, f.write('\n'), I got this error:

Error: Parse error on line 6:
..._currency": "EUR"} { "date": "2018-12-
----------------------^
Expecting 'EOF', '}', ',', ']', got '{'

Apparently I am writing to the file incorrectly. What am I doing wrong?

I have also tried writing it in .jsonl (JSON Lines) format, using with open('cleaned.jsonl', 'w') and f.write('\n'), yet I still get

JSONDecodeError: Extra data: line 2 column 1 (char 98)

when I load it again using json.load().

4 Comments

  • No need to loop, just dump dicts. That is, if you want a regular JSON array of JSON objects. If you want to create ndjson/jsonl, say so; in that case you don't need the comma. Commented Mar 28, 2022 at 12:31
  • It did not work. I tried with open('cleaned.json', 'w') as f: json.dump(dict2, f) f.write('\n'). Commented Mar 28, 2022 at 12:35
  • With your suggestion I get: Error: Parse error on line 101: ...ge_rate": 7.8498 }, ----------------------^ Expecting 'STRING', 'NUMBER', 'NULL', 'TRUE', 'FALSE', '{', '[', got 'EOF' Commented Mar 28, 2022 at 12:38
  • However, it worked with with open('.cleaned.jsonl', 'w') as f: json.dump(dict2, f) f.write('\n'), so your solution only works for 'jsonl'. Commented Mar 28, 2022 at 12:41

1 Answer

import json

data = [{'date': '2018-12-11',
  'target_currency': 'DKK',
  'exchange_rate': 7.4641,
  'base_currency': 'EUR'},
 {'date': '2018-12-11',
  'target_currency': 'GBP',
  'exchange_rate': 0.90228,
  'base_currency': 'EUR'}]

with open('data.json', 'w') as f:
    # Write the whole list as a single JSON array (regular JSON)
    json.dump(data, f, indent=4)

with open('data.jsonl', 'w') as f:
    # Write one JSON object per line (JSON Lines / ndjson)
    for item in data:
        f.write(json.dumps(item) + '\n')

output data.json:

[
    {
        "date": "2018-12-11",
        "target_currency": "DKK",
        "exchange_rate": 7.4641,
        "base_currency": "EUR"
    },
    {
        "date": "2018-12-11",
        "target_currency": "GBP",
        "exchange_rate": 0.90228,
        "base_currency": "EUR"
    }
]

You can read this file back in one go with json.load().
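
For example, reading it back in might look like this (a small sketch, using the data.json file written above):

import json

# Load the whole JSON array back into a list of dicts
with open('data.json') as f:
    data = json.load(f)

print(data[0]['target_currency'])  # DKK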

output data.jsonl:

{"date": "2018-12-11", "target_currency": "DKK", "exchange_rate": 7.4641, "base_currency": "EUR"}
{"date": "2018-12-11", "target_currency": "GBP", "exchange_rate": 0.90228, "base_currency": "EUR"}

Note that you cannot read ndjson/jsonl with the json module as a single document; you need to read and parse it line by line, as shown below. There are third-party libraries that handle ndjson/jsonl conveniently. Also, you cannot validate ndjson/jsonl on https://jsonlint.com.
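
A minimal sketch of reading the data.jsonl file back line by line with only the standard library could look like this:

import json

records = []
with open('data.jsonl') as f:
    for line in f:
        line = line.strip()
        if line:  # skip blank lines
            records.append(json.loads(line))

print(records[1]['exchange_rate'])  # 0.90228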


2 Comments

Yes, that's fine. I can open it either with json.load() or .read().splitlines(). Thank you very much.
In the case of jsonl there's no need to read the whole content; just iterate over the file handle.
