When capturing `microtime()` at the start of a script and again at the end, why does the measured time change every time the script runs?
Does it have to do with other processes running? With how the script was processed?
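The measurement pattern in question presumably looks something like the sketch below (`usleep()` stands in for the real work being timed). The elapsed value is wall-clock time, so it includes time the OS spends on other processes, disk and network waits, and scheduling jitter, which is why it differs between runs.

```php
<?php
// Minimal sketch of timing a script with microtime(true).
// microtime(true) returns the current Unix timestamp as a float
// with microsecond precision.
$start = microtime(true);

// ... the work being measured; usleep() is a stand-in here
usleep(10000); // sleep roughly 10 ms

$elapsed = microtime(true) - $start;
printf("Elapsed: %.6f seconds\n", $elapsed);
```

Because this measures wall-clock time rather than CPU time consumed by the script itself, two runs of identical code will almost never report the same number.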
It is normal to see small differences between runs for external reasons, as others have stated.
But if you see large differences, or if you want to find a possible bottleneck (network latency, an overloaded database, disk I/O, etc.), you may need to investigate more deeply.
To do that, profile your script with Xdebug or a similar tool.
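As a starting point, Xdebug's profiler can be enabled from `php.ini`; the settings below use the Xdebug 3 option names, and the output directory is just an example:

```ini
; php.ini — enable Xdebug's profiler (Xdebug 3 option names)
xdebug.mode=profile
xdebug.output_dir=/tmp
```

Each request then writes a cachegrind file to the output directory, which you can open in a viewer such as KCachegrind or QCacheGrind to see where the script actually spends its time.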