
When I capture microtime() at the start of a script and again at the end, why does the elapsed time change every time I run the script?

Does it have to do with other processes running, or with how the script was processed?
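The setup described above looks roughly like this (a minimal sketch; the loop is just a placeholder workload):

```php
<?php
// Capture a high-resolution timestamp before the work...
$start = microtime(true); // float seconds since the Unix epoch

// ...placeholder workload; any real script body goes here.
$sum = 0;
for ($i = 0; $i < 100000; $i++) {
    $sum += $i;
}

// ...and again after, then report the difference.
$elapsed = microtime(true) - $start;
printf("Elapsed: %.6f seconds\n", $elapsed);
```

Running this repeatedly prints a slightly different elapsed time each run, which is the behavior being asked about.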

How much time are you talking about? A couple of milliseconds, or whole seconds? Commented Oct 8, 2009 at 20:46

5 Answers


External factors cause the time differences. Server load and memory management/paging are two examples of why the timing can differ between runs.


Probably both: it could be due to other things going on on the server, or to caching and similar effects that usually make the first run the slowest and subsequent runs faster.


Far too many factors influence how long a single PHP request takes. As long as the differences are not an obvious sign that something odd is going on (one request takes 100 ms, the next takes 1800 ms), they are quite normal.


As others have stated, small differences caused by external factors are normal.

But if you see big differences, or if you want to find a possible bottleneck (network latency, an overloaded database, disk I/O, etc.), you need to investigate more deeply.

To do that, profile your script with Xdebug or a similar tool.
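Enabling Xdebug's profiler only takes a few php.ini settings. A typical sketch, using the directive names from Xdebug 3 (older Xdebug 2 releases used `xdebug.profiler_enable` and `xdebug.profiler_output_dir` instead):

```ini
; php.ini sketch: turn on Xdebug's profiler for every request.
; The output directory is an example path, not a required location.
zend_extension=xdebug
xdebug.mode=profile
xdebug.output_dir=/tmp/xdebug
xdebug.start_with_request=yes
```

The profiler writes cachegrind-format files to the output directory, which you can open in a viewer such as KCachegrind or QCachegrind to see where the time actually goes.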


Like others said, lots of things can change a script's runtime. Big ones are disk I/O, database access, and the overall server load at the time. When benchmarking, I take several readings and average them, then compare the averages when checking for slowdowns or speedups.
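The averaging approach described above can be sketched as follows (`benchmark()` and the workload are illustrative names, not a standard API):

```php
<?php
// Run a callable several times and return the mean elapsed seconds.
// Averaging smooths out run-to-run noise from load, caching, and I/O.
function benchmark(callable $fn, int $runs = 5): float
{
    $total = 0.0;
    for ($i = 0; $i < $runs; $i++) {
        $start = microtime(true);
        $fn();
        $total += microtime(true) - $start;
    }
    return $total / $runs;
}

// Example workload: string concatenation in a loop.
$avg = benchmark(function () {
    $s = '';
    for ($i = 0; $i < 10000; $i++) {
        $s .= 'x';
    }
});
printf("Average over 5 runs: %.6f seconds\n", $avg);
```

Comparing such averages between two versions of the code is more reliable than comparing single runs.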

