So, basically, what I want to do is send multiple GET requests from the client to the web server all at once. My objective is to load test the server, so it is important that all the requests are sent simultaneously. I have read about the GIL, and it seems that true parallelism with multithreading is not possible in Python. Multiprocessing seems too heavyweight, and the client ends up crashing when testing a large number of users. Is there a way to achieve the parallelism I need using the threading module itself? What other alternatives do I have?
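For instance, my understanding is that the GIL is released while a thread is blocked on socket I/O, so would a rough sketch like the one below (placeholder URL and user count, not my actual code) really overlap the requests, or would they be serialized?

# Rough sketch: NUM_USERS threads each issue one GET, released together by a
# Barrier so the requests start at (roughly) the same instant.
import threading
import urllib.request

URL = 'http://192.0.0.1/testing2.txt'  # placeholder target
NUM_USERS = 10                         # placeholder user count

start_line = threading.Barrier(NUM_USERS)  # every thread waits here, then all fire

def one_request():
    start_line.wait()  # block until all NUM_USERS threads are ready
    with urllib.request.urlopen(URL) as resp:
        resp.read()

threads = [threading.Thread(target=one_request) for _ in range(NUM_USERS)]
for th in threads:
    th.start()
for th in threads:
    th.join()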
import http.client
import del1  # local helper module that deletes all files from the destination folder on each run
import urllib.request
import time
import os
t = float(input('enter time in seconds\t'))
ip = input('enter server ip address (default value: 192.0.0.1)\t')
n = input('enter no of users (default value: 1)\t')
file_name = input('Enter the file name (default value: testing2.txt)\t')
pr = input('enter protocol (default value: http)\t')

# fall back to defaults for any blank input
if ip == '':
    ip = '192.0.0.1'
if file_name == '':
    file_name = 'testing2.txt'
if pr == '':
    pr = 'http'
if n == '':
    n = 1

# add one extra source IP per simulated user on the test interface
for i in range(3, 3 + int(n)):
    s_ip = '192.0.0.' + str(i)
    os.system('sudo ip address add ' + s_ip + '/24 dev ens33')
# build the URL from the chosen protocol, server IP and file name
link = pr + '://' + ip + '/' + file_name
site = urllib.request.urlopen(link)
num = 0
print('file size: ' + str(site.length))

t_end = time.time() + t
# bind the connection to the last added source IP; source port 0 lets the
# OS pick a free ephemeral port
con = http.client.HTTPConnection(ip, source_address=(s_ip, 0))
while time.time() < t_end:
    con.request('GET', '/' + file_name)
    r1 = con.getresponse()
    data1 = r1.read()
    # write each response to its own file; 'with' closes the handle so
    # file descriptors are not leaked across iterations
    with open('/home/client1/test/try' + str(num) + '.txt', 'wb') as handle:
        handle.write(data1)
    num += 1

print('number of files downloaded is: ' + str(num))
through = num * site.length / t    # bytes transferred per second
through = through / (1024 * 1024)  # convert to megabytes per second
through = round(through, 4)
print('Throughput = ' + str(through) + ' MB/s')
This is my code so far. It works for a single user, but I don't know how to extend it to multiple users. Would using a different implementation of Python, like Jython, be useful?
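One direction I have been considering (untested; worker() is just a placeholder name) is to keep the per-user download loop but run one copy per source IP in a thread pool instead of separate processes, roughly like this:

# Untested sketch: worker() mirrors the while-loop above; t, ip, file_name
# and n are the values read from input() earlier in the script.
import concurrent.futures
import http.client
import time

def worker(source_ip):
    # each simulated user gets its own connection bound to its own source IP
    con = http.client.HTTPConnection(ip, source_address=(source_ip, 0))
    count = 0
    end = time.time() + t
    while time.time() < end:
        con.request('GET', '/' + file_name)
        resp = con.getresponse()
        resp.read()  # discard the body; only counting completed downloads
        count += 1
    con.close()
    return count

source_ips = ['192.0.0.' + str(i) for i in range(3, 3 + int(n))]
with concurrent.futures.ThreadPoolExecutor(max_workers=int(n)) as pool:
    counts = list(pool.map(worker, source_ips))
print('total files downloaded: ' + str(sum(counts)))

Would something like this be a sound approach, or would the workers still be serialized by the GIL?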