
Scenario: You are building a large JavaScript-driven web application where you want as few page refreshes as possible. Imagine 80-100MB of unminified JavaScript, just for the sake of having a number to define "large".

My assumption is that if you lazy-load your JavaScript files you can get a better balance on your load times (meaning you don't have to wait a few seconds each time the page refreshes), hopefully resulting in the user not really noticing any lag during loading. I'm guessing that in a scenario like this, lazy-loading would be more desirable than your typical single minified .js file.
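To illustrate the kind of lazy-loading I mean, here is a minimal sketch that injects a script tag only when a feature is first needed (the file name and button id are hypothetical placeholders):

    // Load a script on demand by injecting a <script> tag.
    function loadScript(src, onReady) {
      var script = document.createElement('script');
      script.src = src;
      script.onload = onReady; // fires once the file has downloaded and executed
      document.head.appendChild(script);
    }

    // Only fetch the editor module the first time the user asks for it.
    var editorRequested = false;
    document.getElementById('edit-button').onclick = function () {
      if (editorRequested) return;
      editorRequested = true; // guard against duplicate requests
      loadScript('features/editor.js', function () {
        // editor code is now available
      });
    };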

Now, theoretically, there is a fixed cost for a request to any file on a given server, regardless of the file's size. So, too many requests would not be desirable. For example, if a small javascript file loads at the same time as 10 other small- or medium-sized files, it might be better to group them together to save on the cost of multiple requests.
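To make the grouping concrete, a trivial build step could concatenate the small files into one bundle so they cost a single request. A sketch in Node.js, with hypothetical file names:

    // build.js -- concatenate small modules into one bundle.
    var fs = require('fs');

    var files = ['util.js', 'dom.js', 'events.js']; // the ten small files in practice
    var bundle = files
      .map(function (name) { return fs.readFileSync(name, 'utf8'); })
      .join('\n;\n'); // the semicolon guards against files missing a trailing one

    fs.writeFileSync('bundle.js', bundle);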

My question is, assuming reasonable defaults (say the client has a 3-5Mbps connection and a decent piece of hardware), what is an ideal size of a file to request? Too large, and you are back to loading too much at one time; too small, and the cost of the request becomes more expensive than the amount of data you are getting back, reducing your data-per-second economy.

Edit: All the answers were fantastic. I only chose Ben's because he gave a specific number.

  • Great question! I'm curious about the actual numbers here myself. Commented Aug 3, 2011 at 22:10
  • Nitpick: "mb" stands for millibit, "Mb" stands for megabit and "MB" stands for megabyte. May I assume that you actually mean 80-100MB and 3-5Mbps respectively? Commented Aug 3, 2011 at 22:25
  • @BalusC thanks for the clarification! It should read correctly now :) Commented Aug 3, 2011 at 22:34

3 Answers


Google's Page Speed initiative covers this in some detail:

http://code.google.com/speed/page-speed/docs/rules_intro.html

Specifically http://code.google.com/speed/page-speed/docs/payload.html



I would try to keep the amount that needs to be loaded to show the page (even if just the loading indicator) under 300K. After that, I would pull down additional data in chunks of up to 5MB at a time, with a loading indicator (maybe a progress bar) shown. I've had 15MB downloads fail on coffee-shop broadband Wi-Fi that otherwise seemed OK. If it were bad enough that <5MB downloads failed, I probably wouldn't blame a website for not working.
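As a rough sketch of that approach (URLs, element id, and chunk list are hypothetical), the chunks could be fetched with XMLHttpRequest so progress events can drive the indicator:

    // Fetch one chunk, reporting progress as a 0..1 fraction.
    function fetchChunk(url, onProgress, onDone) {
      var xhr = new XMLHttpRequest();
      xhr.open('GET', url);
      xhr.onprogress = function (e) {
        if (e.lengthComputable) onProgress(e.loaded / e.total);
      };
      xhr.onload = function () { onDone(xhr.responseText); };
      xhr.send();
    }

    // Pull ~5MB chunks one after another, updating a progress bar.
    var chunks = ['app.part1.js', 'app.part2.js'];
    (function next(i) {
      if (i >= chunks.length) return;
      fetchChunk(chunks[i], function (fraction) {
        document.getElementById('progress').style.width =
          (100 * (i + fraction) / chunks.length) + '%';
      }, function (source) {
        eval(source); // or inject the text via a <script> element instead
        next(i + 1);
      });
    })(0);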

I also would consider downloading two files at a time, beyond the initial <300K file, using a loader like LabJS or HeadJS to programmatically add script tags.
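Without committing to a particular loader, the two-at-a-time idea looks roughly like this sketch (file names hypothetical); dynamically inserted scripts with async set to false download in parallel but still execute in insertion order:

    // Start two downloads in parallel while preserving execution order.
    ['module-a.js', 'module-b.js'].forEach(function (src) {
      var script = document.createElement('script');
      script.src = src;
      script.async = false; // parallel download, ordered execution
      document.head.appendChild(script);
    });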

Comments

+1 for mentioning some good libraries. I've been looking into RequireJS myself. So if every download could be the same size, then ≈5MB would be your ideal size, or is that a maximum?
It would be my ideal size after the first 5MB was loaded. I might split the first 5MB into several chunks, though. For instance, if I had something I could view after 200K, edit after 2MB, and offline-search after 100MB, I'd load the 200K first, the remaining 1.8MB second, and after that pull down the remaining 98MB in 5MB chunks.
Also, after seeing Jethro Larson's link to Kyle Simpson's reddit comment, I would go with the two-files-at-once thing I hinted at earlier: I'd split each stage into two requests. My numbers are guesses, but I hope you get the idea. And recognizing that the HTTP per-request overhead is lower than I assumed, I would make two 2.5MB requests at a time instead of one 5MB request.
I just realized you are talking about loading all the files asynchronously on site load. Would you still try to keep each download to 5MB if you load the resources on demand, only as you need them?
Yes, I probably would. I didn't realize I'd gotten sidetracked into thinking about a different scenario. After the initial load, I'd load whatever needs to be loaded, showing a spinner or progress bar for the bigger files, each ideally split into two requests. Again, my sizes are just guesses, though.

I think it's clear that making the client download more than a megabyte of JS before they can do anything is bad, and that making the client download more of anything than necessary is also bad. But there's a clear benefit to having it all cached.

Factors that will influence the numbers:

  1. Round-trip time
  2. Server response time
  3. Header Size (including cookies)
  4. Caching Technique
  5. Browser (see http://browserscope.com)

Balancing parallel downloading against differing cache requirements is another factor to worry about. This was partially covered recently by Kyle Simpson here: http://www.reddit.com/r/javascript/comments/j7ng4/do_you_use_script_loaders/c29wza8
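Since the first two factors can be measured directly, here is a sketch using the Resource Timing API to estimate the fixed per-request overhead of the scripts already on a page (support varies by browser, and cross-origin entries may report zeros without a Timing-Allow-Origin header):

    // Estimate per-request overhead for each script resource on the page.
    performance.getEntriesByType('resource').forEach(function (entry) {
      if (entry.initiatorType !== 'script') return;
      var overhead = entry.responseStart - entry.requestStart; // round trip + server time
      var transfer = entry.responseEnd - entry.responseStart;  // actual download
      console.log(entry.name,
                  'overhead ~' + overhead.toFixed(0) + 'ms,',
                  'transfer ~' + transfer.toFixed(0) + 'ms');
    });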

Comments

You mention caching technique. Do you mean the browser's method of caching?
It's browserscope.org. Here's the network section browserscope.org/?category=network Also I didn't know exactly what BrowserScope does but I've got a better idea now. Useful site. I also recommend saucelabs.com for testing your own site in other browsers.
@benekastah More the things you can do to affect it: ETags, Expires headers, manual caching via localStorage, the application cache, and how you group files that change often versus files that change rarely. (There's a sketch of the localStorage approach after this thread.)
Ah, I see. Thanks, you guys, this has been a very helpful discussion!
Who is Kyle Simpson? Should we care?
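Picking up the localStorage suggestion from the comments above, here is a minimal sketch of manually caching a script's source so repeat visits skip the network; the URL, cache key, and versioning scheme are hypothetical, and real code would need quota and invalidation handling:

    // Cache a script's source text in localStorage, keyed by a version string.
    function loadCached(url, key) {
      var source = localStorage.getItem(key);
      if (source) {
        execute(source); // cache hit: no network request at all
        return;
      }
      var xhr = new XMLHttpRequest();
      xhr.open('GET', url);
      xhr.onload = function () {
        try { localStorage.setItem(key, xhr.responseText); } catch (e) { /* quota full */ }
        execute(xhr.responseText);
      };
      xhr.send();
    }

    // Run fetched source by injecting it as an inline script element.
    function execute(source) {
      var script = document.createElement('script');
      script.text = source;
      document.head.appendChild(script);
    }

    loadCached('app.js?v=42', 'script:app:42');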
