
I tried the solution shared in this link: Nested json to csv - generic approach

This worked for Sample 1, but produces only a single row for Sample 2. Is there a generic way in Python to handle both Sample 1 and Sample 2?

Sample 1:

{
    "Response": "Success",
    "Message": "",
    "HasWarning": false,
    "Type": 100,
    "RateLimit": {},
    "Data": {
        "Aggregated": false,
        "TimeFrom": 1234567800,
        "TimeTo": 1234567900,
        "Data": [
            {
                "id": 11,
                "symbol": "AAA",
                "time": 1234567800,
                "block_time": 123.282828282828,
                "block_size": 1212121,
                "current_supply": 10101010
            },
            {
                "id": 12,
                "symbol": "BBB",
                "time": 1234567900,
                "block_time": 234.696969696969,
                "block_size": 1313131,
                "current_supply": 20202020
            }
        ]
    }
}

Sample 2:

{
    "Response": "Success",
    "Message": "Summary succesfully returned!",
    "Data": {
        "11": {
            "Id": "3333",
            "Url": "test/11.png",
            "value": "11",
            "Name": "11 entries (11)"
        },
        "122": {
            "Id": "5555555",
            "Url": "test/122.png",
            "Symbol": "122",
            "Name": "122 cases (122)"
        }
     },
    "Limit": {},
    "HasWarning": False,
    "Type": 50
}
  • I haven't looked at Sample 2 in close detail, but doesn't pandas' json_normalize work for it? Then you can push it into a DataFrame and export to any format you like. Commented Feb 28, 2022 at 7:52
  • json_normalize does not work out of the box if you have lists somewhere in the JSON. Commented Feb 28, 2022 at 8:15
  • Agreed with Krishna, direct json_normalize doesn't support generically nested JSON files. Commented Feb 28, 2022 at 11:22
  • I tried the shared code, which works well for Sample 1 data, but this generic code generates only a single row as output for Sample 2 data. Commented Feb 28, 2022 at 11:24
  • What is the error you are getting? Commented Mar 2, 2022 at 4:10
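As the first comment suggests, pandas can handle each sample with a small amount of sample-specific knowledge. This is a hedged sketch (the `sample1`/`sample2` dicts below are trimmed versions of the data above, not the full payloads): for Sample 1, `json_normalize` with a `record_path` reaches the inner list; for Sample 2, `DataFrame.from_dict(..., orient="index")` turns each keyed entry into a row. It is not one fully generic function, which is what the question is after.

```python
import pandas as pd

# Sample 1: the per-row records live in a list at Data -> Data,
# so record_path can reach them directly.
sample1 = {"Data": {"Data": [
    {"id": 11, "symbol": "AAA"},
    {"id": 12, "symbol": "BBB"},
]}}
df1 = pd.json_normalize(sample1, record_path=["Data", "Data"])

# Sample 2: Data is a dict keyed by id, one record per value;
# orient="index" makes each value a row and each key the index.
sample2 = {"Data": {
    "11": {"Id": "3333", "Url": "test/11.png"},
    "122": {"Id": "5555555", "Url": "test/122.png"},
}}
df2 = pd.DataFrame.from_dict(sample2["Data"], orient="index")
```

Either DataFrame can then be written out with `df.to_csv(...)`.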

1 Answer


Try this; you need to install flatten_json first (`pip install flatten_json`).

import sys
import csv
import json
from flatten_json import flatten

# Flatten the nested JSON into a single-level dict whose keys
# encode the original nesting (e.g. Data_11_Id).
with open(sys.argv[1], encoding='utf-8') as f:
    data = flatten(json.load(f))

# Write the flattened dict out as a one-row CSV.
with open('foo.csv', 'w', newline='', encoding='utf-8') as f:
    out = csv.DictWriter(f, data.keys())
    out.writeheader()
    out.writerow(data)

Output

> cat foo.csv
Response,Message,Data_11_Id,Data_11_Url,Data_11_value,Data_11_Name,Data_122_Id,Data_122_Url,Data_122_Symbol,Data_122_Name,Limit,HasWarning,Type
Success,Summary succesfully returned!,3333,test/11.png,11,11 entries (11),5555555,test/122.png,122,122 cases (122),{},False,50

Note: `False` is not valid JSON; you need to change it to `false` in Sample 2 before parsing.


Comments

I'm getting this error while trying to use flatten_json: "UnicodeDecodeError: 'charmap' codec can't decode byte 0x8d in position 1001498: character maps to <undefined>". I'm looking for generic code that can handle both Sample data 1 and Sample data 2; flatten_json is unable to handle Sample 1.
It's not because of the library; it's the encoding of the file you have issues with.
Hi Krishna, I tried utf8, utf16, and cp437, along with errors='ignore' as well. But I'm still looking for generic code, e.g. stackoverflow.com/questions/37706351/…, which can handle both Sample 1 and Sample 2 to convert a nested JSON file to CSV.
And even if we put Sample 2 in a string of data and apply the provided solution, it creates only a single row as output; we should be getting 2 rows from the Sample 2 JSON data.
Please share your expected output in each case.
