
I have a script that is called via cron every 30 minutes, but I would like some tips to improve it, because as it stands it is VERY slow.

Its logic is simple:

  1. Query the database for all records whose status is not "delivered".

  2. If records exist, loop over them with a foreach; for each record, call another class that performs a cURL request to fetch an XML document, turns it into an MD5 hash, and compares that hash against the one currently stored in the database.

  3. If the hashes differ, do some other work.

The problem is that this process is very slow. Can anyone help me improve it?

<?php

require_once($_SERVER['DOCUMENT_ROOT'] . 'setup.php');

$MySQL = new MySQL();
$resultados = $MySQL->query("SELECT * FROM objetos WHERE situacao != 'entregue' AND email = 1")->fetchAll(PDO::FETCH_ASSOC);

if ($resultados) {
    foreach ($resultados as $resultado) {
        $Rastreio = new Correios(new cURL(), $resultado['cod_objeto']);

        if ($Rastreio->resultado['hash'] != $resultado['hash']) {

            $user = $MySQL->query("SELECT nome, email FROM usuarios WHERE id_usuario = {$resultado['fk_id_usuario']}")->fetch(PDO::FETCH_LAZY);

            // Doing something different

            $MySQL->exec("UPDATE objetos SET situacao = '{$Rastreio->resultado['status']}', hash = '{$Rastreio->resultado['hash']}' WHERE cod_objeto = '{$resultado['cod_objeto']}' AND fk_id_usuario = {$resultado['fk_id_usuario']}");
        }
    }
}

All the answers were useful. I think the problem is in the database; my queries are not well optimized. I'll try what you suggested.

  • Try doing some profiling to see where most of the time is being spent. I suspect it is either the cURL calls, or $resultados is simply a very large array, in which case there might not be much you can do about it. Commented Oct 14, 2012 at 4:09
  • A simple way to test for bottlenecks is to log timestamps after each part of the script executes. That way you can check whether the "problem" lies in querying the database or in the cURL part. Commented Oct 14, 2012 at 4:17
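The timestamp-logging idea from the comment above can be sketched like this (a minimal example; the section names and the usleep() stand-ins are placeholders, not part of the original script):

```php
<?php
// Minimal profiling sketch: record elapsed wall-clock time per section
// with microtime(true), then print a summary at the end.

$timings = [];

$start = microtime(true);
// ... run the database query here ...
usleep(1000); // stand-in for the real work
$timings['db_query'] = microtime(true) - $start;

$start = microtime(true);
// ... run the cURL / hashing part here ...
usleep(1000); // stand-in for the real work
$timings['curl_hash'] = microtime(true) - $start;

foreach ($timings as $section => $seconds) {
    printf("%s: %.4f s\n", $section, $seconds);
}
```

Whichever section dominates the totals is where optimization effort should go first.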

1 Answer


This is not a complete answer, but I can't comment, so it will have to be here.

Some optimizations can be achieved:

  • Do you really need SELECT *? Can't it be more specific for the fields that you need?
  • Is the condition AND email = 1 really needed?
  • Are you reusing the cURL session or creating a new one for each request?
  • You can unset rows from $resultados after each iteration so the set is smaller.
    The gain may be small, but it may still help.
  • Are you retrieving unnecessary things in your cURL request?
    Like headers if you don't need them.
  • If you are logging the cURL errors, do you have access to the file?
    Are you appending or rewriting?

There may be other things to consider, but these questions should be the first things to check.
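To illustrate a few of the points above (selecting only the columns the loop uses, preparing the UPDATE once outside the loop, and reusing a single cURL handle), here is a rough sketch. It assumes plain PDO and the curl_* functions rather than the question's custom MySQL/cURL wrapper classes, and the DSN, credentials, tracking URL, and status value are all placeholders:

```php
<?php
// Rough sketch of the suggested optimizations, using plain PDO and the
// curl_* extension. DSN, credentials, and the tracking URL are placeholders.

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// 1. Select only the columns the loop actually needs, not SELECT *.
$rows = $pdo->query(
    "SELECT cod_objeto, hash, fk_id_usuario
       FROM objetos
      WHERE situacao != 'entregue' AND email = 1"
)->fetchAll(PDO::FETCH_ASSOC);

// 2. Prepare the UPDATE once, outside the loop; bind values per iteration.
$update = $pdo->prepare(
    "UPDATE objetos SET situacao = :situacao, hash = :hash
      WHERE cod_objeto = :cod AND fk_id_usuario = :uid"
);

// 3. Reuse one cURL handle for every request instead of creating a new one,
//    and skip response headers since they are not needed.
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, false);

foreach ($rows as $row) {
    curl_setopt($ch, CURLOPT_URL, 'https://example.com/track/' . $row['cod_objeto']);
    $xml = curl_exec($ch);
    if ($xml === false) {
        continue; // log curl_error($ch) somewhere appropriate
    }

    $hash = md5($xml);
    if ($hash !== $row['hash']) {
        // ... notify the user, parse the new status out of $xml, etc. ...
        $update->execute([
            ':situacao' => 'novo_status', // placeholder for the parsed status
            ':hash'     => $hash,
            ':cod'      => $row['cod_objeto'],
            ':uid'      => $row['fk_id_usuario'],
        ]);
    }
}

curl_close($ch);
```

The prepared statement also removes the string interpolation in the original UPDATE, which was both a bug source and an SQL injection risk.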

