
I'm using basic cURL requests to fetch webpages in PHP; however, these webpages are large and my bandwidth is limited.

Is there a way to reduce/optimize cURL data usage, for example by using compression? I've also heard that Brotli compression is the best, but I'm not sure how to use it.

1 Answer

$headers[] = "Accept-Encoding: gzip"; // tell the server you accept gzip
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, true);
curl_setopt($ch,CURLOPT_ENCODING , "gzip"); // tells curl to gunzip it automatically
$data = curl_exec($ch);

I haven't tried this with Brotli; support for it varies with the cURL and PHP versions involved, which you didn't tell us about.
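A minimal sketch of the Brotli case, assuming your libcurl build was compiled with Brotli support (the URL is a placeholder): passing an empty string to CURLOPT_ENCODING makes curl advertise every encoding it supports (gzip, deflate, and br where available) and decode the response automatically.

$ch = curl_init("https://example.com/");        // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_ENCODING, "");         // "" = offer all encodings this curl build supports, including br if compiled in
$data = curl_exec($ch);
curl_close($ch);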


3 Comments

Do you have an estimated percentage of the bandwidth that gzip could save me? Also, I'm using Apache...
No, because it depends on your content. For 7-bit natural language, I would expect a reduction in the region of 70-90%. GIF, PNG and JPEG data is already highly compressed, and gzipping it will make it slightly larger (hence a correctly configured webserver won't attempt to compress these).
Hmm, I see. Is there a way to determine how much bandwidth cURL uses (in PHP) so I can test the difference?
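One way to test this (a sketch, not part of the original answer): request the page once normally and once with only the Accept-Encoding header set. Without CURLOPT_ENCODING, curl does not decode the response, so strlen() on the second body is roughly the compressed on-the-wire size. The URL is a placeholder, and this assumes the server actually honors Accept-Encoding: gzip.

function fetch(string $url, array $headers = []): string {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    if ($headers) {
        curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
    }
    $body = curl_exec($ch);
    curl_close($ch);
    return $body === false ? '' : $body;
}

$url     = "https://example.com/";                  // placeholder URL
$plain   = fetch($url);                             // uncompressed body
$gzipped = fetch($url, ["Accept-Encoding: gzip"]);  // still gzip-compressed, curl did not decode it

printf("uncompressed: %d bytes\n", strlen($plain));
printf("gzipped:      %d bytes\n", strlen($gzipped));
// gzdecode($gzipped) would give you the original HTML back if you need it.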
