9

When building advanced JS interfaces and games I've found that I have to dig deeper into how browsers handle memory for JS. My experience with memory and JavaScript is that the memory gets clogged (making animations and calculations slow/laggy) when:

  • There is a lot of JS-generated content on the page
  • There is a lot of graphics (img elements) on the page?

I have therefore concluded that if I want to keep my memory fresh I should include as much HTML code as possible from the beginning of the document, since it will be cached and not kept in memory. And of course remove all elements that are not currently used.

Does anyone have any more info on this? Resources? Links?

2 Comments
  • What exactly do you mean by "cached and not kept in memory"? That doesn't make any sense to me. Commented Nov 2, 2009 at 15:07
  • I'm no hardware pro on this one. I guess I thought that there is a difference between a file being read from cache and from memory. But then again, maybe all cached files are passed through memory. Or maybe memory is the same as cache in this example. As I said, I don't really know how it all fits together. That's why I'm asking :) Commented Nov 2, 2009 at 15:43

3 Answers

8

Some things to keep in mind:

  • IE gets killed by DOM complexity. The more elements are part of the page, the slower it gets. I've seen pages slow down noticeably with as few as 3000 elements on them (if you have a grid with 10 columns and 100 rows, that's 1000 elements right there). The right approach is typically to unload hidden parts from the DOM (detach them).
  • IE also has a long history of not correctly freeing HTML elements that have JavaScript handlers attached. If you have a long-lived page that's often refreshed, read up on IE memory leaks and how to work around those issues.
  • All browsers store images uncompressed in memory. So preloading a gazillion large images in the background is generally a bad idea.
  • Updating DOM properties causes page reflows, which on complex pages can take a long time. Sometimes even reading DOM properties (e.g. offsetHeight) can be very slow, because it may force a reflow first.
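The handler-leak point above is usually worked around by unbinding listeners explicitly before detaching a node. A rough sketch (the bookkeeping via a `_handlers` list is my own convention, not a standard API):

```javascript
// Sketch: track attached handlers so they can be unbound before the
// node is detached, breaking the DOM <-> JS reference cycle that
// older IE versions fail to collect. The _handlers list is ad hoc.
function attachHandler(node, type, handler) {
  if (node.addEventListener) {
    node.addEventListener(type, handler, false);
  } else {
    node.attachEvent('on' + type, handler); // old IE fallback
  }
  node._handlers = node._handlers || [];
  node._handlers.push({ type: type, handler: handler });
}

function detachNode(node) {
  var h = node._handlers || [];
  for (var i = 0; i < h.length; i++) {
    if (node.removeEventListener) {
      node.removeEventListener(h[i].type, h[i].handler, false);
    } else {
      node.detachEvent('on' + h[i].type, h[i].handler);
    }
  }
  node._handlers = null; // drop the JS-side references
  if (node.parentNode) {
    node.parentNode.removeChild(node); // detach from the tree
  }
}
```

Libraries of that era (jQuery, Prototype) did essentially this bookkeeping for you, which is one reason they were recommended for long-lived IE pages.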

Generally, JavaScript itself is not a performance bottleneck. What kills it is interaction with the DOM. Code that doesn't touch the DOM rarely has performance issues. There are only rules of thumb here: interact with the DOM as rarely as possible, keep DOM complexity as low as possible, and avoid repeated page reflows.
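"Interact with the DOM as rarely as possible" usually means batching writes. A minimal sketch (function and id names are illustrative) that builds rows inside a detached DocumentFragment, so appending them triggers one reflow instead of one per row:

```javascript
// Sketch: build elements off-DOM in a DocumentFragment, then append
// the whole fragment once. The live page reflows a single time.
function buildRows(rowCount) {
  var fragment = document.createDocumentFragment();
  for (var i = 0; i < rowCount; i++) {
    var row = document.createElement('div');
    row.className = 'row';
    row.appendChild(document.createTextNode('Row ' + i));
    fragment.appendChild(row); // no reflow: fragment is off-DOM
  }
  return fragment;
}

function renderTable(containerId, rowCount) {
  var container = document.getElementById(containerId);
  container.appendChild(buildRows(rowCount)); // single reflow here
}
```

Appending the same rows one by one directly into the live container would let the browser reflow after every insertion on complex pages.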


7 Comments

Alright. So interaction with the DOM is an issue too, especially with a large DOM. Good to know. What happens if I change/reduce the DOM tree? Does that add to memory (does it store revisions?) or does it really lighten the load? By "page reflows" do you mean an update of all the document's elements according to the visual formatting model?
The DOM tree is not versioned. If you detach something from the DOM tree, and don't have any references left to it from JavaScript (not as easy as it sounds, because of prototype chains), then it will be garbage-collected.
Great! I can still manually garbage-collect it somehow to be sure, though? I remember that back in the day that's what people did in certain situations in IE.
I wonder if it's just structural changes to the DOM that matter, like removing or adding elements, or whether changes to existing attributes (like the class attribute) also force a reflow. Man, do I appreciate the expert info! :)
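The detach-and-release pattern these comments describe might look like the sketch below (names are mine). Note there is no manual "collect now" call in standard JavaScript; all you can do is drop references and let the collector reclaim the nodes:

```javascript
// Sketch: detach a subtree and drop the last JS reference to it.
// Once nothing reachable points at the nodes, the garbage collector
// is free to reclaim them; there is no explicit trigger.
var panel = null;

function showPanel(parent) {
  panel = document.createElement('div');
  panel.appendChild(document.createTextNode('temporary UI'));
  parent.appendChild(panel);
}

function hidePanel() {
  if (panel && panel.parentNode) {
    panel.parentNode.removeChild(panel); // detach from the DOM tree
  }
  panel = null; // release the reference so GC can do its work
}
```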
3

For starters: all HTML, whether it is "included from the beginning" or not, is kept in memory. Most likely so is all image content for the current page. At a bare minimum, everything that you see on screen at any given time is kept in memory at that time.

3 Comments

I just thought that there is a limit to how browsers handle memory. I mean, you can do a lot more intense scripting in Flash than you can in JavaScript before you run into lagging problems. Something is holding my scripting back, right? Some restrictions on browser scripting in general?
You can do a lot more intense scripting in Flash because it has a very fast script engine. A variant on that engine (Tamarin) is now in Firefox, while Safari and Chrome both have ludicrously fast new ones of their own. Alas Internet Explorer is lagging behind somewhat.
Ah, I see. So it's the engine that's slow and old. Figures. I really like Chrome's engine.
2

It tends to depend more on what you're doing with it, to be honest. A lot of graphics won't do squat to JavaScript performance if you aren't interacting with them, but if you've got an enormous page filled with different elements and you're searching the entire document for a single element, that's a different thing entirely.

I've had problems doing things like adding massive numbers of event handlers to pages, running too many loops in a page, and running too many timers.

If JavaScript performance is an issue and you're planning on doing intensive JavaScript, you might want to look at Web Workers. Here are a few more links on Web Workers:
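The basic Web Worker pattern looks roughly like this (the file name worker.js is a placeholder; the worker's own code is shown in comments because it lives in a separate file):

```javascript
// Sketch: offload a CPU-heavy loop to a Web Worker so the UI thread
// stays responsive. 'worker.js' is a hypothetical separate file:
//
//   // worker.js
//   self.onmessage = function (e) {
//     var sum = 0;
//     for (var i = 0; i < e.data; i++) sum += i;
//     self.postMessage(sum);
//   };

function startHeavyJob(limit, onDone) {
  var worker = new Worker('worker.js');
  worker.onmessage = function (e) {
    onDone(e.data);     // result arrives asynchronously
    worker.terminate(); // free the worker's thread and memory
  };
  worker.postMessage(limit); // kick off the computation
}
```

The catch, as the comment below notes, is that workers have no DOM access and, in 2009, limited browser support.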

1 Comment

Using Web Workers isn't mainstream yet. Modern browsers have yet to reach the rural parts of the web.
