
I have thoroughly checked my website, maybe 100 times over the past few days, with Google PageSpeed Insights. On the mobile tests, it keeps pointing out that I have long tasks that take roughly 60-70 ms to execute; I believe their threshold is 50 ms. On the desktop tests, I hardly ever see this warning.

When I profiled my own JavaScript code, what took the longest was concatenating strings, which I do to build the background image for my website.

Here's what I did.

  1. I made a picture with a small file size to use as the background
  2. I base64-encoded it
  3. I replaced repeated single characters with function calls that generate the repetition (this alone saved a few hundred bytes).

In the end, the total string size (not counting the repeats) is about 2000 bytes.
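Step 3 happens offline; roughly, it amounts to a pass like the sketch below (a simplified illustration with an arbitrary run-length threshold, not my exact build script):

// Run-length compress repeated characters in a base64 string,
// emitting copy() calls for runs of minRun or more characters.
function compressRepeats(b64, minRun = 8) {
  const runs = new RegExp("(.)\\1{" + (minRun - 1) + ",}", "g");
  const parts = [];
  let last = 0;
  for (const m of b64.matchAll(runs)) {
    if (m.index > last) parts.push(JSON.stringify(b64.slice(last, m.index)));
    parts.push("copy(" + JSON.stringify(m[1]) + "," + m[0].length + ")");
    last = m.index + m[0].length;
  }
  if (last < b64.length) parts.push(JSON.stringify(b64.slice(last)));
  return parts.join("+");
}

// Example: compressRepeats("xxAAAAAAAAAAyy") -> '"xx"+copy("A",10)+"yy"'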

My code is roughly the code shown below (except that the real version uses actual picture data instead of the placeholder strings):

function copy(x, y) { return x.repeat(y); }

console.time("T");
// UNIZQUE, UNIAQUE, etc. stand in for unique base64 segments.
document.body.style.background =
  "url('data:image/png;base64,UNIZQUE" + copy("A", 500) +
  "UNIAQUE" + copy("B", 250) +
  "UNIQUE" + copy("C", 150) +
  "UNIBQUE')";
console.timeEnd("T");

When I measured the time in my web browser, the log reported between 250 ms and 300 ms.
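One thing that measurement doesn't separate is how much of that time goes to the concatenation itself versus applying the style. Timing the two steps independently would look roughly like this (a sketch using the same placeholder segments):

console.time("build");
const url = "url('data:image/png;base64,UNIZQUE" + copy("A", 500) +
  "UNIAQUE" + copy("B", 250) +
  "UNIQUE" + copy("C", 150) +
  "UNIBQUE')";
console.timeEnd("build");

console.time("apply");
document.body.style.background = url;
console.timeEnd("apply");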

Is there any way I can cut down the string-processing time without inflating the size of the HTML?

I investigated different concatenation methods here on Stack Overflow, and the reported stats offered no help, because I'm already using the recommended form of concatenation (the + operator).

  • Why do you think you have to save a few hundred bytes (not kilobytes, right?) in your HTML? Could be an XY problem. Commented Oct 25, 2024 at 20:15
  • Array.join is faster than string concatenation, but that won't matter for PageSpeed. More important is to reduce the size and number of requests. Commented Oct 25, 2024 at 21:10
  • Davi, people are in an affordability crisis. According to whatdoesmysitecost.com, the average HTML code of a webpage costs a mobile user up to 32 cents to load. My HTML pages probably cost them 1-4 cents to load. Commented Oct 25, 2024 at 21:14
  • @mike_s those statistics are highly questionable, and fail to take into account extremely important considerations like client caching of popular libraries. Commented Oct 25, 2024 at 23:39
  • As mentioned before, your server should be using gzip compression to serve files. That means that your fully expanded base64 image gets crunched down quite a lot in transit, automatically. Commented Oct 26, 2024 at 14:02
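For reference, the Array.join variant mentioned in the comments would look roughly like this (same placeholder segments as in the question):

function copy(x, y) { return x.repeat(y); }

// Build the same data URL from an array of segments and a single join.
const parts = [
  "url('data:image/png;base64,UNIZQUE", copy("A", 500),
  "UNIAQUE", copy("B", 250),
  "UNIQUE", copy("C", 150),
  "UNIBQUE')"
];
document.body.style.background = parts.join("");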
