
I'm looping through a number of image URIs to download them locally.

This is the code I'm using:

const optionsImg = {
    url: basePath,
    headers: {
        'User-Agent': 'request'
    }
};

let download = function(url, filename, callback) {
    optionsImg.url = url;
    request.head(optionsImg, (err, res, body) => {
        console.log('content-type:', res.headers['content-type']);
        console.log('content-length:', res.headers['content-length']);

        request(optionsImg).pipe(fs.createWriteStream(filename)).on('close', callback);
    });
};

But it's only ever saving the last image in the list. My suspicion is that the downloads run asynchronously, and the fs.createWriteStream keeps being interrupted until the loop has finished, so only the last image downloads successfully.

The console.logs do show different content-length values.

How best to get round this issue? Thanks in advance.

Edit: This is the loop code:

for (let x = 0; x < pics.length; x++) {
    download(pics[x], localPath + x + '.jpeg', function() {
        console.log('done');
    });
}
  • Why don't you use multer? Commented Jun 18, 2021 at 11:42
  • Are you saving every file to a different filename? Commented Jun 18, 2021 at 11:46
  • Seems like you are overriding what you are saving Commented Jun 18, 2021 at 11:47
  • You can create multiple streams for read and pipe them to write stream. You should check this. Commented Jun 18, 2021 at 11:53
  • Yeah. That looks like the very thing. Thanks dude. Commented Jun 18, 2021 at 11:55

1 Answer


By the time request.head has finished (a network request, so practically "an eternity" as far as the computer is concerned), optionsImg.url will always hold the last value assigned, because you keep re-using the same object.

Consider this:

const obj = { a: undefined };
for(let i = 0; i <= 3; i++) { 
  obj.a = i; 
  setTimeout(console.log, 100, obj);   
}
// will log {a:3} always

You'd fix the above problem by cloning the object:

const obj = { a: undefined };
for(let i = 0; i <= 3; i++) { 
  const obj2 = Object.assign({}, obj);
  obj2.a = i; 
  // or const obj2 = { ...obj, a: i };
  setTimeout(console.log, 100, obj2);   
}
// will log {a:0}, {a:1}, {a:2}, {a:3}

And the same goes for your code:

let download = function(url, filename, callback) {
   const options = {
       ...optionsImg,
       url: url,
   }
   /* alternatively you can do:
   const options = Object.assign({}, optionsImg);
   options.url = url; */
   request.head(options, (err, res, body) => {
      console.log('content-type:', res.headers['content-type']);
      console.log('content-length:', res.headers['content-length']);

      request(options).pipe(fs.createWriteStream(filename)).on('close', callback);
   });
};

Note that Object.assign and {...optionsImg} only do a shallow copy, so the nested headers object is still shared between the copies.

Now every time download() is called, it and the request() calls it triggers will have their own copy of the options object.


Edit: you don't need this for the problem stated in the OP, but should you also want to copy the headers object (perhaps because you want to change its values for every request), then you could do:

const options = {
  ...optionsImg,
  headers: { 
    ...optionsImg.headers,
    // as an example:
    'X-this-is-request-number': `${x} out of ${pics.length}`,
  },
  url: url,
};

You could also search for a recursive object cloning utility; there are many.
