I've got this script that will generate a thumbnail for any URL passed to it (e.g. my_script.php?url=http://www.google.com).

It works, but I want to modify it so that I can pass a large number of URLs (about 2,100) through it and generate a screenshot for each of them. It already saves the images in separate folders as well.

Here is the pertinent code:

// Only run for authenticated requests
if ($_GET['auth'] == 'todc_admin') {
    if (isset($_GET['id'])) {
        // Submit the thumbnail job to WebThumb
        $wb = new WebThumb();
        $wb->setApi("API-KEY-HERE");
        $job = $wb->requestThumbnail($_GET['url']);
        $job_id = $job[0]['id'];

        // Poll every 5 seconds until the job has finished rendering
        while (true) {
            $job_status = $wb->requestStatus($job_id);
            $status = $job_status['status'];
            if ($status == "Complete") {
                break;
            }
            sleep(5);
        }

        // Image generated, so fetch it and save it to the design's folder
        //header("Content-type: image/jpeg");
        $filename = ABSPATH . todc_design_dir($_GET['id']) . '/thumbnail.jpg';
        $imgurl = todc_design_dir($_GET['id']) . '/thumbnail.jpg';
        $content = $wb->getThumbnail($job_id, "large");
        if (file_put_contents($filename, $content)) {
            echo '<img src="http://www.myurl.com' . $imgurl . '" />';
        }
    }
}

I'm also able to generate a list of all the URLs I need to create thumbnails for, using this:

$designs = get_posts( array('post_type' => 'design', 'post_status' => 'publish', 'orderby' => 'date', 'showposts' => -1) );

foreach ($designs as $design) {
    $previewlink = get_bloginfo('wpurl') . todc_design_dir($design->ID);
}

Then I echo $previewlink wherever I need it.

I'm just struggling to put the two together.

Any thoughts?

2 Answers

You could pass the URLs as a JSON-encoded array, which the script then decodes back into a PHP array with json_decode(). You would then use a foreach loop to iterate over the URLs.

Also, you should use POST for such a large amount of data, as GET has a maximum data size limit.

$urls = json_decode($_POST['url']); // decode the JSON array back into PHP
foreach ($urls as $url) {
    $job = $wb->requestThumbnail($url);
    // rest of code
}
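
On the sending side, a rough sketch (reusing the get_posts() loop and todc_design_dir() helper from the question; the batch_thumbs.php endpoint name and host are placeholders) could build the list and POST it with cURL:

// Build the list of preview URLs from the published designs
$designs = get_posts(array(
    'post_type'   => 'design',
    'post_status' => 'publish',
    'orderby'     => 'date',
    'showposts'   => -1,
));

$urls = array();
foreach ($designs as $design) {
    $urls[] = get_bloginfo('wpurl') . todc_design_dir($design->ID);
}

// POST the JSON-encoded list to the thumbnail script (placeholder endpoint)
$ch = curl_init('http://www.myurl.com/batch_thumbs.php?auth=todc_admin');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('url' => json_encode($urls)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

Note that the receiving script also needs each design's ID to pick the right save folder, so in practice you'd POST an array of id/url pairs the same way.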

You may also need to increase the script's maximum execution time, depending on how long 2,100 URLs would take to process; use set_time_limit(int $seconds) for this.
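
For a one-off batch this size, one option (a sketch) is simply to lift the limits at the top of the script:

// Let the batch run as long as it needs (0 = no execution time limit)
set_time_limit(0);

// Optionally raise the memory ceiling too, since each thumbnail is
// held in memory before being written to disk
ini_set('memory_limit', '256M');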

First thought: this sounds process-intensive. Doing it through your web browser is likely to hit PHP's memory and execution time limits. A better option would be to store the URLs in a database and run the batch as a forked process.
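
A minimal sketch of that idea, assuming a hypothetical thumb_queue table (id, design_id, url, done) and a hypothetical request_and_save_thumbnail() helper that wraps the WebThumb logic from the question. The script runs from the command line (e.g. backgrounded with nohup) instead of through the browser:

// batch_worker.php -- run as: php batch_worker.php &
set_time_limit(0);

// Connection details are placeholders
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

// Pull the URLs that still need thumbnails
$rows = $pdo->query('SELECT id, design_id, url FROM thumb_queue WHERE done = 0')
            ->fetchAll(PDO::FETCH_ASSOC);

$mark_done = $pdo->prepare('UPDATE thumb_queue SET done = 1 WHERE id = ?');

foreach ($rows as $row) {
    // request_and_save_thumbnail() stands in for the request/poll/save
    // sequence in the question's code
    request_and_save_thumbnail($row['design_id'], $row['url']);
    $mark_done->execute(array($row['id']));
}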

1 Comment

I should specify that this would be a one-time job. In future, I'll only need to do this one at a time. I recently changed the size of the image the script generates due to a new site layout, which has made all the old images pixelated!
