I'm trying to test the speed of my servers and compare it with third-party servers, including Facebook, Tumblr and Google. I need to build a report based on at least hundreds of requests, and I decided to do this with Python.
The idea is that I upload an image to Facebook, then re-download it (since Facebook modifies images on upload) and upload that version to the other third-party servers as well as my own. I then request the file x times from each server, and the script prints how long each request took.
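For the re-download step, a minimal sketch along these lines should work (the URL below is just a placeholder for wherever Facebook serves the processed image):

from urllib import urlretrieve

# Placeholder: the CDN URL you get back for the processed image
fb_url = "https://scontent.example.com/processed.jpg"

# Save the processed image locally so it can be re-uploaded elsewhere
urlretrieve(fb_url, "processed.jpg")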
Here is my script:
from time import time
from urllib import urlopen

url = raw_input("Please enter the URL you want to test: ")

size = 0
for i in range(100):
    start_time = time()
    pic = urlopen(url)
    if pic.getcode() == 200:
        data = pic.read()  # read the body so the download itself is included in the timing
        delta_time = time() - start_time
        size = len(data)
        print "%d ms" % (delta_time * 1000)  # report in milliseconds
    else:
        print "error: HTTP %d" % pic.getcode()

print "%d requests made. File size: %d B" % (i + 1, size)
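Since you mention building a report, it may also help to collect the timings in a list and summarize them instead of only printing each one. A rough sketch of that idea (the time_requests helper and the example URL are mine, not from your script):

from time import time
from urllib import urlopen

def time_requests(url, n=100):
    """Return a list of per-request durations in seconds."""
    times = []
    for _ in range(n):
        start = time()
        urlopen(url).read()  # fetch the whole body each time
        times.append(time() - start)
    return times

times = time_requests("http://example.com/image.jpg")
print "min %.1f ms / avg %.1f ms / max %.1f ms" % (
    min(times) * 1000,
    sum(times) / len(times) * 1000,
    max(times) * 1000,
)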
I'm not great at Python, so I'm not sure I'm doing this right.
Is this the best way to do this?