
I have Python code that extracts balance sheet reports in a loop for multiple locations through API GET requests. I have set up an else statement to print the IDs of all the locations that return no JSON data.

Sometimes the loop works all the way through to the final report. But most of the time the code throws the error below and stops running:

Traceback (most recent call last):

  File "<ipython-input-2-85715734b89c>", line 1, in <module>
    runfile('C:/Users/PVarimalla/.spyder-py3/temp.py', wdir='C:/Users/PVarimalla/.spyder-py3')

  File "C:\Users\PVarimalla\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 827, in runfile
    execfile(filename, namespace)

  File "C:\Users\PVarimalla\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 110, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)

  File "C:/Users/PVarimalla/.spyder-py3/temp.py", line 107, in <module>
    dict1 = json.loads(json_data)

  File "C:\Users\PVarimalla\Anaconda3\lib\json\__init__.py", line 348, in loads
    return _default_decoder.decode(s)

  File "C:\Users\PVarimalla\Anaconda3\lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())

  File "C:\Users\PVarimalla\Anaconda3\lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None

JSONDecodeError: Expecting value

For example, a perfect run: the script reports 15 location IDs out of 50 that it couldn't fetch JSON data for, and I end up with a dataframe with all the other franchisees' balance sheets appended.

Incorrect runs: each time I run the script, it reports 5 (or 6, or 3) location IDs that it couldn't fetch JSON data for, then stops running with the above error.

I don't understand why the script runs perfectly sometimes and behaves weirdly the rest of the time (which is most of the time). Is it because of my internet connection, or an issue with Spyder 3.7?

I don't think there is an error anywhere in my script, but I'm unsure why I'm facing this issue. Please help me with this.

Below is the code:

import requests
import json
#import DataFrame
import pandas as pd
#from pandas.io.json import json_normalize
#import json_normalize
access_token = 'XXXXXXXXX'
url = 'https://api.XXXX.com/v1/setup'
url_company = "https://api.*****.com/v1/Reporting/ProfitAndLoss?CompanyId=1068071&RelativeDateRange=LastMonth&DateFrequency=Monthly&UseAccountMapping=true&VerticalAnalysisType=None"
url_locations_trend = "https://api.*****.com/v1/location/search?CompanyId=1068071"
url_locations_mu = "https://api.*****.com/v1/location/search?CompanyId=2825826"
url_locations_3yrs = "https://api.qvinci.com/v1/location/search?CompanyId=1328328"
ult_result = requests.get(url_locations_trend,
      headers={
               'X-apiToken': '{}'.format(access_token)})


#decoded_result= result.read().decode("UTF-8")
json_data_trend = ult_result.text
dict_trend = json.loads(json_data_trend)

locations_trend = {}

#Name
locations_trend["Name"] = []
for i in dict_trend["Items"]:
    locations_trend["Name"].append(i["Name"])

#ID
locations_trend["ID"] = []
for i in dict_trend["Items"]:
    locations_trend["ID"].append(i["Id"])

#creates dataframe for locations under trend transformations
df_trend = pd.DataFrame(locations_trend)

#making a call to get locations data for under 3 yrs
ul3_result = requests.get(url_locations_3yrs,
      headers={
               'X-apiToken': '{}'.format(access_token)})

#decoded_result= result.read().decode("UTF-8")
json_data_3yrs= ul3_result.text
dict_3yrs = json.loads(json_data_3yrs)

locations_3yrs = {}

#Name
locations_3yrs["Name"] = []
for i in dict_3yrs["Items"]:
    locations_3yrs["Name"].append(i["Name"])

#ID
locations_3yrs["ID"] = []
for i in dict_3yrs["Items"]:
    locations_3yrs["ID"].append(i["Id"])

#creates dataframe for locations under 3 yrs  
df_3yrs = pd.DataFrame(locations_3yrs)

#making a call to get locations data for under 3 yrs
ulm_result = requests.get(url_locations_mu,
      headers={
               'X-apiToken': '{}'.format(access_token)})

#decoded_result= result.read().decode("UTF-8")
json_data_mu = ulm_result.text
dict_mu = json.loads(json_data_mu)

locations_mu = {}

#Name
locations_mu["Name"] = []
for i in dict_mu["Items"]:
    locations_mu["Name"].append(i["Name"])

#ID
locations_mu["ID"] = []
for i in dict_mu["Items"]:
    locations_mu["ID"].append(i["Id"])

#creates dataframe for locations under 3 yrs  
df_mu = pd.DataFrame(locations_mu)


locations_df = pd.concat([df_mu, df_3yrs, df_trend])

df_final = pd.DataFrame()
count = 0
for i in locations_df["ID"]:
    if count < 3:
        url_bs = "https://api.******.com/v1/Reporting/BalanceSheet?DateFrequency=Monthly&UseAccountMapping=true&VerticalAnalysisType=None&IncludeComputedColumns=true&RelativeDateRange=LastTwoCYTD&UseCustomDateRange=false&CompanyId=2825826&Locations=" + i
    elif 2 < count < 12:
        url_bs = "https://api.******.com/v1/Reporting/BalanceSheet?DateFrequency=Monthly&UseAccountMapping=true&VerticalAnalysisType=None&IncludeComputedColumns=true&RelativeDateRange=LastTwoCYTD&UseCustomDateRange=false&CompanyId=1328328&Locations=" + i
    else :
        url_bs = "https://api.******.com/v1/Reporting/BalanceSheet?DateFrequency=Monthly&UseAccountMapping=true&VerticalAnalysisType=None&IncludeComputedColumns=true&RelativeDateRange=LastTwoCYTD&UseCustomDateRange=false&CompanyId=1068071&Locations=" + i

    result = requests.get(url_bs,
          headers={
                   'X-apiToken': '{}'.format(access_token)})
    #decoded_result= result.read().decode("UTF-8")

    json_data = result.text
    if(json_data != ""): 
        final = {}

        dict1 = json.loads(json_data)

        final["Months"] = dict1["ReportModel"]["ColumnNames"]
        final["Location"] = [dict1["SelectedOptions"]["Locations"][0]]*len(final["Months"])
        set = {"Total 10000 Cash","Total 12000 Inventory Asset","Total Other Current Assets","Total Fixed Assets","Total ASSETS",
               "Total Accounts Payable","Total Credit Cards","24004 Customer Deposits","Total Liabilities","Total Equity","Total Long Term Liabilities"}

        def search(dict2):
            if len(dict2["Children"]) == 0:
                return
            for i in dict2["Children"]:
                if(i["Name"] in set):
                    final[i["Name"]] = []
                    for j in i["Values"]:
                        final[i["Name"]].append(j["Value"])
                search(i)

            if ("Total " + dict2["Name"]) in set:
                final["Total " + dict2["Name"]] = []
                for j in dict2["TotalRow"]["Values"]:
                    final["Total " + dict2["Name"]].append(j["Value"])
            return

        for total in dict1["ReportModel"]["TopMostRows"]:
            search(total)

        df_final = pd.concat([df_final,pd.DataFrame(final)], sort = False)
    else: print(i)
    count = count + 1

#exporting dataframe to pdf    
#df_final.to_csv(, sep='\t', encoding='utf-8')
df_final.to_csv('file1.csv')

Thank you.

  • Always put the full error message (starting at the word "Traceback") in the question (not a comment) as text (not a screenshot). It contains other useful information. Commented Oct 22, 2019 at 14:41
  • "I think I have no error in my whole script" - without seeing the script, there's really no way for us to help. Commented Oct 22, 2019 at 14:43
  • Maybe the API gets so many requests from people all over the world that it has no time to answer all of them. You should check that you actually got data before you try to use it. Commented Oct 22, 2019 at 14:44
  • I have pasted the whole code and the error that I'm receiving. @furas Commented Oct 22, 2019 at 14:54
  • BTW: requests has .json() - dict_mu = ult_result.json() - so you don't need json.loads(). But first you should check that ult_result.text is not empty, because if the server doesn't answer, you can't get JSON data. Use if not ult_result.text: exit(1), or try/except: try: dict_mu = ult_result.json() except: exit(1). Commented Oct 22, 2019 at 15:27
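The check furas describes can be sketched as a small helper. This is a minimal sketch, not code from the question: `safe_parse` is a hypothetical name, and in the real script you would pass it `ult_result.text` (or call `ult_result.json()` directly inside the try/except).

```python
import json

def safe_parse(text):
    """Return parsed JSON, or None if the body is empty or not valid JSON."""
    if not text:
        return None
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        return None

# A valid body parses; an empty body or an HTML error page yields None
# instead of raising JSONDecodeError and killing the loop.
print(safe_parse('{"Items": []}'))     # {'Items': []}
print(safe_parse(""))                  # None
print(safe_parse("<html>503</html>"))  # None
```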

1 Answer

You should post the code and the entire exception for a more accurate answer. However, it seems to me that the API is sometimes not returning JSON (you could, for example, be making too many requests in a very short period, so the API returns a 404 or an error page instead).

Try printing/logging the API response before decoding it to verify this.

EDIT: Given the feedback, setting an interval between iterations should resolve your issue. You can use time.sleep(0.5) inside the for loop (remember to add import time).

You should also consider using try/except in your code so you can handle exceptions gracefully.
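Combining both suggestions, the inner part of the loop could look roughly like the sketch below. This is illustrative, not the asker's actual code: `fetch_balance_sheet` and `get_text` are hypothetical names, and `get_text` stands in for the real `requests.get(...).text` call so the pattern stays self-contained.

```python
import json
import time

def fetch_balance_sheet(get_text, location_id, pause=0.6):
    """Fetch one report, pausing `pause` seconds to respect a rate limit
    (the API reportedly wants a 500 ms gap between calls).
    `get_text` is any callable returning the raw response body,
    e.g. lambda loc: requests.get(url_for(loc), headers=headers).text
    """
    time.sleep(pause)
    body = get_text(location_id)
    try:
        return json.loads(body)
    except (json.JSONDecodeError, TypeError):
        # Empty or non-JSON body: return None so the caller can log the
        # location ID, as the question's else branch does, and keep looping.
        return None

# Hypothetical stand-in for the real API:
fake_api = {"loc1": '{"ReportModel": {}}', "loc2": ""}
reports = {loc: fetch_balance_sheet(fake_api.get, loc, pause=0.0)
           for loc in ["loc1", "loc2"]}
print(reports)  # {'loc1': {'ReportModel': {}}, 'loc2': None}
```

The key point is that a bad response skips one location instead of aborting the whole run.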


4 Comments

I have pasted the whole code and the error that I'm receiving. @renatodvc
@PraneethReddy From your traceback, it's the dict1 = json.loads(json_data) line that caused the exception. Given that you loop with no pause through an unknown number of GET requests, I still suspect that you are exhausting the API and not getting the response you expect. Place a print(json_data) above the line that assigns dict1 to see what is being passed to json.loads().
Thank you, I think you're right: there are a lot of GET requests happening in the loop. As the error says, there is JSON data available, but each call has to be made with a gap of 500 milliseconds. So I think I need to set a timer of 1 second (or anything greater than 500 milliseconds) for each GET request inside the loop. What do you think?
@PraneethReddy I edited my answer with a suggestion for you. If my answer solved your issue, please consider accepting it (by clicking the checkmark).
