
The situation: I have a page that has to display an undetermined number of images, loaded one by one through AJAX (using base64 encoding on the server side).

var position = 'front';
while(GLOB_PROCEED_FETCH)
{
    getImageRequest(position);
}

function getImageRequest(position)
{
    GLOB_IMG_CURR++;
    $.ajax({
        url: urlAJAX + 'scan=' + position,
        method: 'GET',
        async: false,
        success: function(data) {
            if ((data.status == 'empty') || (GLOB_IMG_CURR > GLOB_IMG_MAX))
            {
                GLOB_PROCEED_FETCH = false;
                return true;
            }
            else if (data.status == 'success')
            {
                renderImageData(data);
            }
        }
    });
}

The problem is that the images (constructed in the renderImageData() function) are appended to the target DIV all together, only after every image has been fetched. In other words, no DOM manipulation is possible until the loop is over.

I need to load and display the images one by one because of the potentially huge number of images, so I can't hold them all back until the last one has been fetched.

2 Comments
  • You're starting a request in the while loop, and that request starts another request if data.status == 'success', and no request when you set GLOB_PROCEED_FETCH = false, which is the condition of the while loop. So what is the reason for the while loop at all? Commented Oct 12, 2013 at 8:24
  • Fixed the initial code Commented Oct 12, 2013 at 8:48

3 Answers


Your best bet would be to restructure your code to use async ajax calls and launch the next call when the first one completes and so on. This will allow the page to redisplay between image fetches.

This will also give the browser a chance to breathe and take care of its other housekeeping and not think that maybe it's locked up or hung.

Also, using async: false is a bad idea. I see no reason why properly structured code couldn't use asynchronous ajax calls here without hanging the browser while you're fetching this data.

You could do it with asynchronous ajax like this:

function getAllImages(position, maxImages) {
    var imgCount = 0;

    function getNextImage() {
        $.ajax({
            url: urlAJAX + 'scan=' + position,
            method: 'GET',
            async: true,
            success: function(data) {
                if (data.status == "success" && imgCount <= maxImages) {
                    ++imgCount;
                    renderImageData(data);
                    getNextImage();
                }
            }
        });
    }
    getNextImage();
}

// no while loop is needed
// just call getAllImages() and pass it the 
// position and the maxImages you want to retrieve
getAllImages('front', 20);

Also, while this may look like recursion, it isn't really, because of the async nature of the ajax call. getNextImage() has actually completed before the next one is called, so it isn't technically recursion.
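To make that concrete, here is a minimal, jQuery-free sketch of the same chaining pattern, with setTimeout standing in for the async ajax call (fetchFake, getNext, and the stop condition are all invented for illustration):

```javascript
// Stand-in for an async ajax call: it invokes the callback on a later tick,
// after the current function has returned and its stack frame is gone.
function fetchFake(i, callback) {
    setTimeout(function () {
        callback({ status: i < 3 ? 'success' : 'empty', index: i });
    }, 0);
}

var received = [];

function getNext(i, done) {
    fetchFake(i, function (data) {
        if (data.status === 'success') {
            received.push(data.index); // render/append one image here
            getNext(i + 1, done);      // chain the next request
        } else {
            done(received);            // stop condition, like GLOB_PROCEED_FETCH
        }
    });
}

getNext(0, function (all) {
    console.log(all.join(',')); // logs "0,1,2" - each item was handled as it arrived
});
```

Because each callback fires on a fresh tick of the event loop, the stack unwinds completely between requests, which is exactly why this chain does not build up stack frames the way true recursion would.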


10 Comments

while(GLOB_PROCEED_FETCH) { setTimeout(getImageRequest(position), 1); } - you mean to do requests in loop without recursive call in "success" handler?
@AlexShumilov - I added a code example using async AJAX which is much better.
I cannot set "async: true", because the number of desired AJAX calls is undetermined - if I set "async: true", the loop fires a huge number of AJAX calls at the start, while GLOB_PROCEED_FETCH has not yet been set.
@AlexShumilov - please look at my code example. It doesn't use the while loop. It launches the next ajax call when the previous one succeeds and stops when it hits the max images or doesn't get success. It can work with async: true.
@robisrob - Also, relevant to this issue is that getNextImage() runs, starts the ajax call and finishes execution all in one quick moment. The ajax success handler is called some time LATER after that invocation of getNextImage() has already completed. Thus, when the next invocation of getNextImage() is called from the success handler, there is no stack frame build-up as the stack has already unwound from the prior call to getNextImage() finishing execution - this is not really classic recursion. So, there is no memory leak or build up with this construct.

Wrong and wrong. Don't use timers, and don't chain them. Look at jQuery Deferred / when; it has everything you need.

var imgara = [];
for (var image in imglist) {
  imgara.push($.ajax({ /* ajax call options for this image */ }));
}
$.when.apply($, imgara).done(function() {
  // do something
}).fail(function() {
  // do something else
});
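For readers without jQuery at hand, the aggregate-then-finish idea maps onto native Promises, with Promise.all playing the role of $.when.apply (fakeLoad and the image names here are invented stand-ins for the real ajax calls):

```javascript
// Invented stand-in for one ajax image request; resolves on a later tick.
function fakeLoad(name) {
    return new Promise(function (resolve) {
        setTimeout(function () { resolve('data:' + name); }, 0);
    });
}

var imglist = ['front', 'back', 'side'];
var results = [];

// Kick off all requests up front, like pushing $.ajax calls into imgara.
var pending = imglist.map(function (name) {
    // A .then on each individual promise handles every image as it
    // arrives, without waiting for the whole batch to finish.
    return fakeLoad(name).then(function (data) {
        results.push(data);
        return data;
    });
});

// Promise.all mirrors $.when.apply($, imgara): it fires once everything is done.
Promise.all(pending).then(function (all) {
    console.log(all.length + ' images loaded'); // logs "3 images loaded"
});
```

Attaching a per-request .then (or .done on a jQuery Deferred) is what allows per-image handling on top of the batch-completion callback.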

3 Comments

If you read the OP's question carefully, this is NOT how to solve it. They say: I need to load and display images one by one because of possible huge number of images, so I can't stack them until they all will be fetched. The OP also wants to append images, in order, as they are fetched (not waiting until everything is done and appending in order). Your suggestion does not meet any of these requirements.
Actually, if you learn how to use the deferred properly, you can do exactly what the OP described and handle each image as it is retrieved. I'm thinking you must be a timer writer, hoping that the .5 seconds per image estimate will never fail. The code above was not an answer, because I don't answer "givz me the codez" questions; it was a sample to point him in the proper direction.
Chris is the problem with this site.

Try using the setInterval() function instead of while().

var fetch = setInterval(loadImage, 2000);

function loadImage(){
    // change the variable `position` here before the next request
    getImageRequest(position);
    if(!GLOB_PROCEED_FETCH){
        clearInterval(fetch);
    }
}
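As a self-contained illustration of the self-clearing interval, here is a hedged sketch in which a simple counter stands in for getImageRequest() and the stop flag (ticks and the limit of 3 are invented for illustration):

```javascript
var ticks = 0;

// Poll every 10 ms; each tick stands in for one getImageRequest(position).
var fetch = setInterval(function loadImage() {
    ticks++;
    if (ticks >= 3) {         // stands in for !GLOB_PROCEED_FETCH
        clearInterval(fetch); // stop polling once fetching is done
    }
}, 10);
```

Note that the interval keeps firing on a fixed schedule regardless of whether the previous request has finished, which is why this approach can bunch up renders when requests are slower than the interval.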

5 Comments

How does this ever stop? It will go on forever. It doesn't observe any of the stop conditions the OP's code has. Also, if you use async ajax (which doesn't lock up the browser), there is no reason to use a timer either.
@jfriend00 Nice observation. Using the clearInterval, as added now. Thank you.
Thank you! It works, but somehow it still renders the images strangely - it renders them in sets of two or three images. I mean, in the very beginning ALL of the images were rendered at the very end; now they are rendered in sets of a random size.
Increasing your interval may give your browser more time to breathe. But I don't think rendering them in sets of a random size would be any problem; it's fine, right?
Thank you again! Actually yeah, as long as it won't cause any problem with the logic.
