
I'm writing a function in Node.js that hits a URL and retrieves its JSON, but I'm getting an error in JSON.parse: unexpected token.

In JSON validators the string passes the test when I copy it from the browser and paste it into the text field, but when I paste the URL for the validator to fetch the JSON itself, it shows me an invalid message.

I guess it is something with the encoding of the response, but I can't figure out what it is. Here is my function with an example URL.

var http = require('http');

function getJsonFromUrl(url, callback)
{
    url = 'http://steamcommunity.com/id/coveirao/inventory/json/730/2/';

    http.get(url, function (res) {
        // explicitly treat incoming data as utf8 (avoids issues with multi-byte chars)
        res.setEncoding('utf8');

        // incrementally capture the incoming response body
        var body = '';
        res.on('data', function (d) {
            body += d;
        });

        // do whatever we want with the response once it's done
        res.on('end', function () {
            console.log(body);
            try {
                var parsed = JSON.parse(body);
            } catch (err) {
                console.error('Unable to parse response as JSON', err);
                return callback(err, null);
            }

            // pass the relevant data back to the callback
            console.log(parsed);
            callback(null, parsed);
        });
    }).on('error', function (err) {
        // handle errors with the request itself
        console.error('Error with the request:', err.message);
        callback(err, null);
    });
}

Can you help me, please?

Thanks in advance for any help.

  • Works just fine with your example URL (apart from body.stringfy() throwing an error). You're not checking whether you actually get a JSON response back from the server (by checking the Content-Type header), though. Commented Feb 5, 2016 at 15:16

1 Answer


Concatenating the response as a string can run into encoding issues, e.g. when a chunk's buffer is converted to a string while a multi-byte UTF-8 sequence is split across a chunk boundary. Thus I'd advise concatenating as buffers first:

var body = Buffer.alloc(0); // new Buffer(0) is deprecated
res.on('data', function (d) {
  body = Buffer.concat([body, d]);
});
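To illustrate the failure mode described above, here is a small sketch; the '€' character and the split point are just an assumed example, not actual chunks from the Steam response:

```javascript
// Hypothetical chunks: a 3-byte UTF-8 character ('€') split across two
// chunks. Converting each chunk to a string separately yields replacement
// characters; concatenating the buffers first decodes correctly.
var euro = Buffer.from('€', 'utf8');   // <e2 82 ac>
var chunk1 = euro.slice(0, 2);         // incomplete sequence
var chunk2 = euro.slice(2);

// per-chunk conversion mangles the character
var broken = chunk1.toString('utf8') + chunk2.toString('utf8');

// buffer concatenation keeps it intact
var intact = Buffer.concat([chunk1, chunk2]).toString('utf8');
```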

Of course, it can help to explicitly convert the buffer to a string yourself rather than relying on JSON.parse() doing it implicitly. This may be essential when an unusual encoding is in use.

res.on('end', function () {
  try {
    var parsed = JSON.parse(body.toString("utf8"));
  } catch (err) {
    console.error('Unable to parse response as JSON', err);
    return callback(err, null);
  }
        ...

Aside from that, the content delivered by the given URL appears to be perfectly valid JSON.


4 Comments

Using setEncoding() will make sure that multibyte sequences aren't cut off, as documented here.
@AndréLuiz As stated by robertklep, this might not be what is causing your particular issue; however, the declaration is given in line 1 of the first example above. Buffer.concat() keeps replacing that buffer with the concatenation of the previous buffer and the received chunk.
Just worked. Thanks! And I just had to comment out the res.setEncoding('utf8'); line, as robertklep suggested
@AndréLuiz Interesting case, for robertklep was plausibly arguing that res.setEncoding("utf-8") was preventing your initial approach from suffering the issues I was presuming. So whatever makes your script work is probably related to something else ... but who cares, as long as it's working now.
