
The streamed response is of the form:

[
  {
    "id": 0,
    "name": "name0"
  },
  {
    "id": 1,
    "name": "name1"
  }
]

If I use node-fetch's stream feature and iterate over response.body, the chunks cut the JSON objects at arbitrary points, so I can't JSON.parse them. I guess node-fetch doesn't treat the body as an array of JSON objects and can't recognize [ and ].

How can I process a streaming JSON array? Or is there a 3rd-party library for this? Sample code:

const fetch = require('node-fetch');

async function main() {
  const response = await fetch(url);
  try {
    for await (const chunk of response.body) {
      console.log('----start');
      // Throws whenever a chunk boundary falls inside an object:
      console.dir(JSON.parse(chunk.toString()));
      console.log('----end');
    }
  } catch (err) {
    console.error(err.stack);
  }
}

main()
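
For reference, buffering the whole body and parsing it once does work, but it gives up streaming entirely. A minimal sketch, assuming the full response fits in memory (same url as above; mainBuffered is just an illustrative name):

const fetch = require('node-fetch');

async function mainBuffered() {
  const response = await fetch(url);
  // Wait for the complete body, then parse the whole array in one go.
  const all = JSON.parse(await response.text());
  for (const elem of all) {
    console.dir(elem);
  }
}

mainBuffered();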
1 Answer

An approach to stream-parse an external JSON source is to combine node-fetch with stream-json, which parses the incoming data regardless of how the (string) data is chunked.

import util from "util";
import stream from "stream";
import StreamArray from "stream-json/streamers/StreamArray.js";
import fetch from "node-fetch";

const response = await fetch(url);

// Pipe the raw body through stream-json's array streamer; it re-assembles
// complete array elements no matter where the chunk boundaries fall.
await util.promisify(stream.pipeline)(
  response.body,
  StreamArray.withParser(),
  // The final stage receives an async iterable of {key, value} pairs.
  async function( parsedArrayEntriesIterable ){
    for await (const {key: arrIndex, value: arrElem} of parsedArrayEntriesIterable) {
      console.log("Parsed array element:", arrElem);
    }
  }
);
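
Each iteration yields a { key, value } pair, where key is the zero-based array index and value is the fully assembled element, so the sample response above comes through as { id: 0, name: "name0" } followed by { id: 1, name: "name1" }.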

stream.pipeline() with an async function as the final stage requires Node.js >= v13.10.
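
On Node.js >= v15, the same thing can be written with the promise-based pipeline from stream/promises instead of util.promisify. A minimal sketch of the variation (url as above):

import { pipeline } from "stream/promises";
import StreamArray from "stream-json/streamers/StreamArray.js";
import fetch from "node-fetch";

const response = await fetch(url);

await pipeline(
  response.body,
  StreamArray.withParser(),
  async function (source) {
    // Same {key, value} pairs as in the answer above.
    for await (const { value } of source) {
      console.log("Parsed array element:", value);
    }
  }
);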
